Switching to Local LLM Setup

This project isn't over, but I'm looking at what it would take to get the same "multi-modal insta-MARC" functionality running in a fully offline setup.

To me this seems like the more interesting use case. I think I've already proved that yes, adding the MARC21 rulebook in a RAG setup can produce better results, but what's the point if your chatbot is basically doing a Bing search with ISBN numbers? Copy-cataloging has been around for a long time (and perhaps catalogers would be better suited to creating custom GPTs based on their own local practices, or integrating with a tool like MarcEdit).
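For anyone curious what the RAG piece actually involves, here's a minimal sketch of the retrieve-then-prompt loop. It assumes scikit-learn is available, and the "rule" strings are paraphrased placeholders standing in for real MARC21 rulebook text; a real setup would use proper chunking and embeddings rather than TF-IDF.

```python
# Minimal RAG retrieval sketch: find the rule snippets most relevant
# to a cataloging question, then build a prompt around them.
# The snippets below are placeholders, not actual MARC21 text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

rules = [
    "245: Title Statement. Subfield $a holds the title proper.",
    "100: Main Entry - Personal Name. Subfield $a holds the name.",
    "264: Production, Publication, etc. Second indicator 1 = publication.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k rule snippets most similar to the question."""
    matrix = TfidfVectorizer().fit_transform(rules + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
    return [rules[i] for i in scores.argsort()[::-1][:k]]

def build_prompt(question: str) -> str:
    """Prepend the retrieved rules as context for the model."""
    context = "\n".join(retrieve(question))
    return f"Using these MARC21 rules:\n{context}\n\nQuestion: {question}"

print(build_prompt("Which field holds the title proper?"))
```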

I'm also not seeing any librarians looking seriously at local LLMs as an option...c'mon, Chat with RTX is right there! Mixtral! Gemma! Don't put all of your eggs in one basket!
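And the offline half really isn't scary. Here's a rough sketch of calling a locally served model through Ollama's REST API; it assumes Ollama is installed with a model already pulled (e.g. `ollama pull gemma`) and listening on its default port. You could feed it the prompt from the retrieval sketch above, and nothing leaves your machine.

```python
# Send a RAG-augmented prompt to a model served locally by Ollama.
# Assumes the Ollama server is running on its default port (11434)
# and the named model has been pulled.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "gemma") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one complete response instead of chunks
    }).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

print(ask_local_llm("Which MARC21 field holds the title proper?"))
```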
