Posts

Showing posts from January, 2024

The 5 Laws of Library Science but for Generative AI

For reference, here's a wiki about the original 5 laws proposed by S. R. Ranganathan in 1931. And now the revision:

1. Adaptive Purpose: AI should dynamically serve specific user needs.
2. Inclusive Personalization: AI must cater to diverse users respectfully.
3. Ethical Insight: AI responses should be accurate, context-aware, and ethically informed.
4. Efficient Utility: AI must provide quick and clear answers to maximize user time.
5. Continual Growth: AI should evolve continuously, learning from interactions and data.

So I had a convo with GPT-4 about how to revise these and make them more concise. Weird how they almost sound like custom instructions, no? We basically have #1-4 already in place, and #5 is very close to being real with things like MemGPT already out there. (I haven't read much about how memory management works with generative AI, but my guess is that we'll quickly need better garbage collection... and certain apps will probably need to avoid long-term memory entirely.)
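To make the "garbage collection" hunch concrete, here's a toy sketch of what evicting stale long-term memories could look like. This is purely illustrative, not how MemGPT or any real system actually manages memory; the class and method names are made up, and the eviction policy is just least-recently-used:

```python
from collections import OrderedDict

class MemoryStore:
    """Toy long-term memory with LRU-style 'garbage collection'.

    Hypothetical sketch only: when the store exceeds its capacity,
    the least recently used memory is evicted.
    """

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.memories = OrderedDict()  # key -> fact, ordered by recency

    def remember(self, key, fact):
        # Re-remembering a key refreshes its recency.
        if key in self.memories:
            self.memories.move_to_end(key)
        self.memories[key] = fact
        # "Garbage collect": drop the least recently used memory on overflow.
        while len(self.memories) > self.capacity:
            self.memories.popitem(last=False)

    def recall(self, key):
        fact = self.memories.get(key)
        if fact is not None:
            self.memories.move_to_end(key)  # recalling also refreshes recency
        return fact
```

An app that wants to avoid long-term memory entirely would just set capacity to the size of the current conversation window and let everything else fall out.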

Book Publishing as a "Synthetic Data" Source (with LoC as Source of Truth)

"Your idea of using Large Language Models (LLMs) to catalog new books and then cross-referencing this with actual cataloging done by the Library of Congress is quite innovative and could have several benefits in the realm of data management and library science. Let's explore this idea in more detail:

1. Automated Cataloging with LLMs: LLMs can process and understand large volumes of text efficiently. They could be trained to read new books and extract key information such as titles, authors, themes, genres, and other metadata. This could significantly speed up the cataloging process.
2. Quality and Consistency Control: By cross-referencing with the Library of Congress catalog, which is a standard in bibliographic data, you can ensure that the data generated by the LLM is accurate and consistent with established cataloging practices.
3. Handling Large Volumes of Books: With the sheer volume of books published every year, LLMs can provide a scalable solution to catalog new books quickly a...
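The cross-referencing step (#2 above) could be sketched as a simple field-by-field comparison between an LLM-generated record and the LoC record treated as ground truth. A minimal sketch, assuming both records arrive as plain dicts; the field names here are illustrative stand-ins, not the actual MARC/LoC schema:

```python
def cross_reference(llm_record, loc_record, fields=("title", "author", "subjects")):
    """Compare an LLM-generated catalog record against a Library of
    Congress record (treated as source of truth) and report any
    field-level disagreements.

    Hypothetical sketch: records are plain dicts and the field names
    are assumptions, not a real cataloging schema.
    """
    mismatches = {}
    for field in fields:
        llm_value = llm_record.get(field)
        loc_value = loc_record.get(field)
        # Normalize simple string fields for a forgiving comparison.
        if isinstance(llm_value, str) and isinstance(loc_value, str):
            if llm_value.strip().lower() != loc_value.strip().lower():
                mismatches[field] = (llm_value, loc_value)
        elif llm_value != loc_value:
            mismatches[field] = (llm_value, loc_value)
    return mismatches
```

Any record with an empty mismatch dict agrees with the LoC baseline; non-empty mismatches are exactly the "synthetic data" candidates worth flagging for human review (or for fine-tuning the extractor).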