<aside>
🚧
Related content will be moved to this page in the future.
</aside>
<aside>
💡
Related pages: There are a huge number of AI-related tools. RAG development tools are collected in RAG開發工具, LLMs and related tools in Large Language Models, Graph RAG development tools in Graph RAG, and other tools in Generative AI開發工具. Some material still remains in Generative AI and RAG and will be migrated here gradually.
</aside>
Introduction
- Large Language Model (LLM) Stack — Version 6
- Models & Hubs
- Data, Ops & Monitoring
- LLM Application Development Tools
- End User Applications
- Tech Stack For Production-Ready LLM Applications In 2024 (Friend Link)
- LLM API and Self Hosted Options
- RAG Databases
- ChromaDB
- Supabase with PGVector
- Agents
- Observability
- Backend
- Python, FastAPI and Pydantic (see the backend sketch after this list)
- Deployment
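A minimal sketch of the backend layer named above: a FastAPI endpoint with Pydantic request/response models that forwards a question to a hosted LLM API. The `/ask` route and the `gpt-4o-mini` model name are assumptions for illustration, not taken from the article.

```python
# Illustrative sketch only: FastAPI + Pydantic backend calling a hosted LLM API.
from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
llm = OpenAI()  # reads OPENAI_API_KEY from the environment


class AskRequest(BaseModel):
    question: str


class AskResponse(BaseModel):
    answer: str


@app.post("/ask", response_model=AskResponse)
def ask(req: AskRequest) -> AskResponse:
    # Forward the validated question to the chat completions endpoint.
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; swap for a self-hosted endpoint if preferred
        messages=[{"role": "user", "content": req.question}],
    )
    return AskResponse(answer=resp.choices[0].message.content)
```

Run it with `uvicorn main:app` and POST a JSON body such as `{"question": "..."}` to `/ask`.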
- Understanding the AI Stack In the Era of Generative AI
- Programming language: The language used to develop the components of the stack, including integration code and source code of the AI application.
- Model provider: Organizations that provide access to foundation models via inference endpoints or other means. Embedding and foundation models are typical models used in generative AI applications.
- LLM orchestrator and framework: A library that abstracts the complexities of integrating components of modern AI applications by providing methods and integration packages. Operators within these components also provide tooling to create, modify, and manipulate prompts and condition LLMs for different purposes.
- Vector database: A data storage solution for vector embeddings. Operators within this component provide features that help manage, store, and efficiently search through vector embeddings (a short sketch wiring this to a model provider follows this list).
- Operational database: A data storage solution for transactional and operational data.
- Monitoring and evaluation tool: Tools for tracking AI model performance and reliability, offering analytics and alerts to improve AI applications.
- Deployment solution: Services that enable easy AI model deployment, managing scaling and integration with existing infrastructure.
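To make the component roles above concrete, here is a minimal sketch, with assumed names and data throughout, that wires a vector database (Chroma) to a model provider (OpenAI) for a retrieve-then-answer flow. The orchestrator, operational database, monitoring, and deployment layers are omitted for brevity.

```python
# Illustrative sketch: vector database + model provider from the stack above.
import chromadb
from openai import OpenAI

# Vector database component: store and search embedded documents.
chroma = chromadb.Client()  # in-memory instance; production would use a hosted server
docs = chroma.create_collection(name="docs")
docs.add(
    ids=["1", "2"],
    documents=[
        "Our refund window is 30 days.",
        "Support is available on weekdays only.",
    ],
)

# Model provider component: a foundation model behind an inference endpoint.
llm = OpenAI()

question = "How long do customers have to request a refund?"
hits = docs.query(query_texts=[question], n_results=1)
context = hits["documents"][0][0]

answer = llm.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system", "content": f"Answer using this context: {context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```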
- What Goes Into AI? Exploring the GenAI Technology Stack
- End Application Builders
- AI Model Builders
- Cloud Service Providers & Data Center Operators
- Microchip Designers
- Microchip Manufacturers (Foundries)
- Silicon and Metal Miners
- Every AI Engineer Should Know these 15 Python Libraries in 2025 (Friend Link)
- Getting Started with Project Setup
- Backend Components
- Data Management
- PostgreSQL and MongoDB
- SQLAlchemy
- Alembic
- AI Integration
- OpenAI, Anthropic, and Google APIs
- Instructor (see the structured-output sketch after this list)
- LangChain and LlamaIndex
- Vector Databases
- Pinecone, Weaviate, and PGVector
- Observability
- Specialized Tools for Advanced Needs
- DSPy
- PyMuPDF and PyPDF2
- Jinja
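As a small illustration of the AI-integration layer, here is a hedged sketch of using Instructor to coerce an LLM response into a Pydantic model. `instructor.from_openai` reflects recent Instructor releases (older versions used `instructor.patch`), and the model name and fields are assumptions for the example.

```python
# Illustrative sketch: structured LLM output with Instructor + Pydantic.
import instructor
from openai import OpenAI
from pydantic import BaseModel


class Invoice(BaseModel):
    vendor: str
    total: float


# Wrap the OpenAI client so responses are parsed and validated into Invoice.
client = instructor.from_openai(OpenAI())

invoice = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    response_model=Invoice,
    messages=[{"role": "user", "content": "Extract the invoice: ACME Corp billed $1,234.50"}],
)
print(invoice.vendor, invoice.total)
```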
- Langchain vs Huggingface (Friend Link)
- One of the standout features of LangChain is its modular design. It offers a range of pre-built modules that handle different aspects of LLM application development, from prompt templating and document loading to chaining model calls and evaluation.
- Hugging Face excels in providing pre-trained models that can be easily fine-tuned for specific tasks. Its transformers library includes models like BERT, GPT-2, and T5, which you can use out of the box (a minimal comparison sketch follows this list).
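A minimal side-by-side sketch of the two styles described above: composing a LangChain prompt with a hosted chat model versus loading a pre-trained Hugging Face model through the transformers pipeline. Model names are assumptions for illustration.

```python
# Illustrative sketch: LangChain composition vs. Hugging Face pre-trained models.

# LangChain: compose modular pieces (prompt -> chat model) into a chain.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")  # assumed model name
print(chain.invoke({"text": "LangChain chains prompts, models, and tools."}).content)

# Hugging Face: load a pre-trained model out of the box with the pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
print(generator("Transformers make it easy to", max_new_tokens=20)[0]["generated_text"])
```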
Development Guidelines
Languages
Python is currently the dominant language for AI development, but a growing number of frameworks now ship JavaScript versions as well.
GenAIScript
JavaScript
Python