feat(llm): support semi-automated prompt generation #281
Merged
imbajin merged 12 commits into apache:main on Jul 1, 2025
Conversation
…er to read and maintain, fix code style
Reordered and deduplicated imports for consistency in base_prompt_config.py, vector_graph_block.py, and prompt_generate.py. Made minor code style improvements and fixed typos in comments to enhance readability and maintainability.
imbajin approved these changes on Jul 1, 2025
Overview
This PR introduces a semi-automated prompt generation feature (#253). It leverages the capabilities of large language models (LLMs): the user provides the original text and a description of the desired scenario, then selects the one system-provided few-shot example that best matches that scenario, and the LLM generates a highly customized prompt for knowledge graph extraction. The generated prompt serves as a reference, lowering the barrier to getting started.
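The workflow above could be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the example data, template string, and constructor signature are assumptions; only the class name `PromptGenerate` and the method `generate_prompt_for_ui()` come from this PR.

```python
import json

# Illustrative few-shot examples keyed by domain; the real entries live in
# prompt_examples.json and are richer (structure assumed here).
EXAMPLES = {
    "general": {
        "schema": "Person, Company, works_at",
        "sample_prompt": "Extract Person/Company entities and works_at edges...",
    },
}


class PromptGenerate:
    """Sketch of the operator: combine user input with one few-shot
    example into a meta-prompt, then ask the LLM for an extraction prompt."""

    def __init__(self, llm, examples=None):
        self.llm = llm
        self.examples = examples or EXAMPLES

    def generate_prompt_for_ui(self, text: str, scenario: str, example_name: str) -> str:
        example = self.examples[example_name]
        # The real template is generate_extract_prompt_template in
        # prompt_config.py; this inline string only illustrates the idea.
        meta_prompt = (
            "You are designing a knowledge-graph extraction prompt.\n"
            f"Target scenario: {scenario}\n"
            f"Original text: {text}\n"
            f"Reference example: {json.dumps(example, ensure_ascii=False)}\n"
            "Return a customized extraction prompt."
        )
        return self.llm.generate(meta_prompt)
```

Any object with a `generate(str) -> str` method can stand in for the LLM backend here, which also makes the operator easy to unit-test.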
Main Changes
New Operator:
- Added prompt_generate.py with the PromptGenerate operator, which constructs the meta-prompt by combining user inputs with a selected few-shot example, invokes the LLM, and returns the generated prompt.
Built-in Prompt Examples & Configuration:
- Added prompt_examples.json, a centralized file storing multiple high-quality, structured few-shot examples for different domains.
- Updated prompt_config.py with a new meta-prompt template (generate_extract_prompt_template) designed to guide the LLM in this generation task.
UI & Workflow Updates:
- Updated vector_graph_block.py to add a collapsible "Assist in generating graph extraction prompts" section in the "Build RAG Index" tab.
- Added a demo.load() event to ensure the preview for the default example is displayed on initial page load.
Sequence Diagram
```mermaid
sequenceDiagram
    participant User
    participant UI as Gradio UI
    participant PromptGen as PromptGenerate
    participant LLM as LLM backend
    participant Resource as Example resources
    User->>UI: Select an example, enter scenario and text, click Generate Prompt
    UI->>PromptGen: Call generate_prompt_for_ui()
    PromptGen->>Resource: Load few-shot example
    PromptGen->>LLM: Assemble meta-prompt, call LLM.generate()
    LLM-->>PromptGen: Return generated extraction prompt
    PromptGen-->>UI: Return generated extraction prompt
    UI-->>User: Display generated extraction prompt
```
New UI interface features
New interface location
Expected scenario/direction and few-shot
Prompt generated by the LLM