├── LICENSE
├── README.md
├── 大模型推荐.md
├── 大语言模型是什么?.md
├── 没有卡的条件下我们能做什么?.md
├── 论文列表.md
└── 论文解读
    ├── pic
    │   ├── 3种note.png
    │   ├── Agent Instructions生成过程.png
    │   ├── AutoGen实验.png
    │   ├── AutoGen框架.png
    │   ├── BBH_result.png
    │   ├── BFS与DFS算法结合ToT.png
    │   ├── CoN prompt.png
    │   ├── CoT and SG CoT.png
    │   ├── CoT approaches.png
    │   ├── CoT formulations.png
    │   ├── CoT_Agent_fw.png
    │   ├── HELM任务分类.png
    │   ├── HKFR_Experimental.png
    │   ├── HKFR_key.png
    │   ├── LATS整体框架.png
    │   ├── LATS迭代过程.png
    │   ├── MCTS_PromptAgent.png
    │   ├── MedPrompt 算法.png
    │   ├── MedPrompt_SOTA.png
    │   ├── MedPrompt图解.png
    │   ├── MedPrompt表现.png
    │   ├── MedPrompt跨域泛化.png
    │   ├── NLU_result.png
    │   ├── QA性能.png
    │   ├── RAG Components.png
    │   ├── RAG Framework.png
    │   ├── RAG vs Self-RAG.png
    │   ├── RAG vs 其他.png
    │   ├── RAG vs 微调.png
    │   ├── RAG时间线.png
    │   ├── RALM vs RALM+CoN.png
    │   ├── RAP思维框架.png
    │   ├── Reflexion框架.png
    │   ├── Reflexion过程.png
    │   ├── STP.png
    │   ├── STP_result.png
    │   ├── Self-RAG tokens.png
    │   ├── Self-generate CoT tem.png
    │   ├── Self_RAG 消融.png
    │   ├── Self_RAG 算法.png
    │   ├── Self_RAG 结果.png
    │   ├── ToT框架与其他三种方法.png
    │   ├── UCT公式.png
    │   ├── Winning rate (%) between zeroshot, zero-shot CoT, and zero-shot AgentInstruct.png
    │   ├── Zero-shot Agent instructions.png
    │   ├── agent研究总结.png
    │   ├── collm_data.png
    │   ├── collm_key.png
    │   ├── collm_warm_cold.png
    │   ├── example_PromptAgent.png
    │   ├── expert_prompt.png
    │   ├── kgllm01.png
    │   ├── kgllm02.png
    │   ├── kgllm03.png
    │   ├── kgllm04.png
    │   ├── kgllm05.png
    │   ├── kgllm06.png
    │   ├── kgllm07.png
    │   ├── kgllm08.png
    │   ├── kgllm09.png
    │   ├── kgllm10.png
    │   ├── kgllm11.png
    │   ├── kgllm12.png
    │   ├── kgllm13.png
    │   ├── kgllm14.png
    │   ├── kgllm15.png
    │   ├── kgllm16.png
    │   ├── kgllm17.png
    │   ├── latm01.png
    │   ├── latm02.png
    │   ├── latm03.png
    │   ├── latm04.png
    │   ├── latm05.png
    │   ├── llara01.png
    │   ├── llara02.png
    │   ├── llara03.png
    │   ├── llara04.png
    │   ├── llara05.png
    │   ├── llara06.png
    │   ├── llara07.png
    │   ├── llara08.png
    │   ├── metagpt01.png
    │   ├── metagpt02.png
    │   ├── metagpt03.png
    │   ├── p_m_r_CoT.png
    │   ├── rs_vs_llmrs.png
    │   ├── scrl01.png
    │   ├── scrl02.png
    │   ├── wizardlm1.png
    │   ├── wizardlm2.png
    │   ├── 与SC对比.png
    │   ├── 与few-shot对比.png
    │   ├── 代理各方面应用.png
    │   ├── 代理架构模块.png
    │   ├── 代理能力获取方法.png
    │   ├── 六种AutoGen框架的应用.png
    │   ├── 噪声鲁棒性.png
    │   ├── 多模型对比.png
    │   ├── 未知鲁棒性.png
    │   ├── 消融实验结果.png
    │   └── 蒙特卡洛树规划.png
    ├── 大模型+推荐系统
    │   ├── CoLLM Integrating Collaborative Embeddings into Large Language Models for Recommendation(中科大).md
    │   ├── LLaRA Aligning Large Language Models with Sequential Recommenders.md
    │   ├── RecSys2023Heterogeneous Knowledge Fusion A Novel Approach for Personalized Recommendation via LLM.md
    │   └── RecSys2023:Large Language Models for Generative Recommendation A Survey and Visionary Discussions(LLM推荐系统综述).md
    ├── 大模型+知识图谱
    │   └── Unifying Large Language Models and Knowledge Graphs A Roadmap(大模型+知识图谱综述).md
    ├── 大模型+金融
    │   └── Integrating Stock Features and Global Information via Large Language Models for Enhanced Stock Return Prediction.md
    └── 大模型
        ├── A Survey on Large Language Model based Autonomous Agents(基于LLM的自主智能体的综述).md
        ├── Agent Instructs Large Language Models to be General Zero-Shot Reasoners.md
        ├── AutoGen Enabling Next-Gen LLM Applications via Multi-Agent Conversation.md
        ├── Can Generalist Foundation Models Outcompete Special-Purpose Tuning Case Study in Medicine.md
        ├── Chain-of-Note Enhancing Robustness in Retrieval-Augmented Language Models.md
        ├── Igniting Language Intelligence The Hitchhiker's Guide From Chain-of-Thought Reasoning to Language Agents.md
        ├── LANGUAGE AGENT TREE SEARCH UNIFIES REASON-ING ACTING AND PLANNING IN LANGUAGE MODELS.md
        ├── Large Language Models as Tool Makers.md
        ├── METAGPT META PROGRAMMING FOR AMULTI-AGENT COLLABORATIVE FRAMEWORK.md
        ├── PromptAgent Strategic Planning with Language Models Enables Expert-level Prompt Optimization.md
        ├── Reasoning with Language Model is Planning with World Model.md
        ├── Reflexion Language Agents with Verbal Reinforcement Learning.md
        ├── Retrieval-Augmented Generation for Large Language Models A Survey(RAG综述).md
        ├── Self-RAG Learning to Retrieve, Generate, and Critique through Self-Reflection.md
        ├── Take a Step Back Evoking Reasoning via Abstraction in Large Language Models.md
        ├── Tree of Thoughts Deliberate Problem Solving with Large Language Models.md
        └── WizardLM Empowering Large Language Models to Follow Complex Instructions.md

Raw file contents for each path above: https://raw.githubusercontent.com/XingYu-Zhong/LLMsStudy/HEAD/<file path>