MCQ Generator

I need to fetch a public YouTube video's transcription with an agent in LangChain.

As a second agent, I will create MCQs based on the first agent's transcribed output.

To achieve this, you can create two agents in LangChain. Here's how you can approach it:

Agent 1: Transcription Agent

  1. Fetch Video Transcription:

    • Use the YouTube Transcript API to fetch the transcription of the public YouTube video.

    • Install the library:

      pip install youtube-transcript-api
    • Fetch transcription:

      from urllib.parse import urlparse, parse_qs
      
      from youtube_transcript_api import YouTubeTranscriptApi
      
      def fetch_transcription(video_url):
          # Handle both youtube.com/watch?v=... and youtu.be/... URL formats
          parsed = urlparse(video_url)
          if parsed.hostname and parsed.hostname.endswith("youtu.be"):
              video_id = parsed.path.lstrip("/")
          else:
              video_id = parse_qs(parsed.query)["v"][0]
          transcript = YouTubeTranscriptApi.get_transcript(video_id)
          return " ".join([t["text"] for t in transcript])
      
      video_url = "https://youtu.be/bq1Plo2RhYI?si=ALig8huv6lfTn_Mb"
      transcription = fetch_transcription(video_url)
      print(transcription)
  2. Integrate with LangChain:

    • Create an agent in LangChain to fetch the transcription.

    from langchain.agents import initialize_agent, AgentType
    from langchain.chat_models import ChatOpenAI
    from langchain.tools import tool
    
    @tool
    def get_youtube_transcription(video_url: str) -> str:
        """Fetch the transcript of a public YouTube video."""
        return fetch_transcription(video_url)
    
    tools = [get_youtube_transcription]  # @tool already wraps the function as a Tool
    llm = ChatOpenAI(temperature=0)
    agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)
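The agent can then be invoked with a natural-language instruction. This is a minimal sketch, assuming `OPENAI_API_KEY` is set in the environment; the exact wording of the instruction is illustrative:

```python
# Assumes the agent defined above is in scope and OPENAI_API_KEY is set.
result = agent.run(
    "Fetch the transcription of https://youtu.be/bq1Plo2RhYI?si=ALig8huv6lfTn_Mb"
)
print(result)
```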

Agent 2: MCQ Generator Agent

  1. Generate MCQs:

    • Use the transcription as input to generate MCQs.

    • Send the transcription, wrapped in an MCQ-generation prompt, to a generative LLM such as OpenAI GPT.

  2. Set up the MCQ generation agent:
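A minimal sketch of this step using an `LLMChain`, assuming `OPENAI_API_KEY` is set; the prompt wording and the `num_questions` input are illustrative choices, not fixed requirements:

```python
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# Illustrative prompt: adjust the question count and answer format to taste.
mcq_prompt = PromptTemplate(
    input_variables=["transcription", "num_questions"],
    template=(
        "You are an exam writer. Based on the transcript below, write "
        "{num_questions} multiple-choice questions, each with four options "
        "(A-D) and the correct answer marked.\n\n"
        "Transcript:\n{transcription}"
    ),
)

llm = ChatOpenAI(temperature=0)
mcq_chain = LLMChain(llm=llm, prompt=mcq_prompt)

# Example call, using the transcription produced by Agent 1:
# mcqs = mcq_chain.run(transcription=transcription, num_questions=5)
```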

Integration Workflow

To run both agents sequentially:
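The two steps can be chained in plain Python. A sketch, assuming `OPENAI_API_KEY` is set; `generate_mcqs` is a hypothetical helper (not a library function), and `fetch_transcription` is the function defined under Agent 1:

```python
from langchain.chat_models import ChatOpenAI

def generate_mcqs(transcription: str, num_questions: int = 5) -> str:
    """Hypothetical helper: ask the LLM for MCQs over the transcript."""
    llm = ChatOpenAI(temperature=0)
    prompt = (
        f"Based on the transcript below, write {num_questions} "
        f"multiple-choice questions with four options (A-D) each and "
        f"the correct answer marked.\n\nTranscript:\n{transcription}"
    )
    return llm.predict(prompt)

video_url = "https://youtu.be/bq1Plo2RhYI?si=ALig8huv6lfTn_Mb"
transcription = fetch_transcription(video_url)  # Agent 1: fetch transcript
print(generate_mcqs(transcription))             # Agent 2: generate MCQs
```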

Results

  • The first agent fetches the transcription from the YouTube video.

  • The second agent uses the transcription to generate MCQs.

You can further refine the LLM prompts and add post-processing if needed.
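As one example of post-processing, the raw MCQ text can be parsed into structured records. The expected layout (numbered question line, `A)`–`D)` option lines, an `Answer:` line, blank lines between questions) is an assumption about how the prompt is worded, not a guarantee of the LLM's output:

```python
import re

def parse_mcqs(raw: str) -> list[dict]:
    """Split LLM output into question/options/answer records.

    Assumes each question block starts with the question line, options are
    labelled A) to D), and the answer line starts with "Answer:". These
    format assumptions must match the prompt you use.
    """
    records = []
    # Question blocks are separated by blank lines.
    for block in re.split(r"\n\s*\n", raw.strip()):
        lines = [ln.strip() for ln in block.splitlines() if ln.strip()]
        if not lines:
            continue
        records.append({
            "question": lines[0],
            "options": [ln for ln in lines[1:] if re.match(r"^[A-D]\)", ln)],
            "answer": next(
                (ln for ln in lines if ln.lower().startswith("answer:")), None
            ),
        })
    return records
```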
