
Learning Analytics and AI
AI Essay Tutor
Date
Team
2024
Individual work
My Role
AI Designer
Target Users / Audience
Students
Tools Used
Python, Visual Studio Code, Groq, Langchain, Chainlit
Overview
This project creates an interactive AI essay tutor that guides users in improving their writing step-by-step using a large language model from Groq. It utilizes Langchain to structure the interaction with the AI and Chainlit to build the chat interface.
This code initializes an AI essay tutor using Langchain and Groq for the AI model, and Chainlit for the user interface.
When a chat session starts (@cl.on_chat_start), it sets up the Groq model, defines a prompt instructing the AI to act as a step-by-step essay tutor, and integrates chat history management using RunnableWithMessageHistory.
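The per-session history idea behind RunnableWithMessageHistory can be sketched without any framework: each session ID maps to its own message list, so separate chats never share context. The names below (history_store, record_turn) are illustrative, not part of the app's code.

```python
# Minimal, framework-free sketch of per-session chat history:
# one store keyed by session_id, created lazily on first use.
history_store = {}

def get_or_create_history(session_id):
    """Return the message list for a session, creating it on first use."""
    if session_id not in history_store:
        history_store[session_id] = []
    return history_store[session_id]

def record_turn(session_id, role, text):
    """Append one (role, text) turn to the session's history."""
    get_or_create_history(session_id).append((role, text))

record_turn("session1", "human", "How do I write a thesis statement?")
record_turn("session1", "ai", "Start by naming your essay's main claim.")
print(len(get_or_create_history("session1")))  # → 2 (both turns kept)
print(len(get_or_create_history("session2")))  # → 0 (a fresh session is empty)
```

In the real app, InMemoryChatMessageHistory plays the role of the plain list, and Langchain records each human/AI turn automatically after every chain invocation.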
Here is the system prompt used: “Act as a knowledgeable essay tutor. Students will ask how to improve their essay writing; guide them one step at a time without providing direct answers, helping them get closer to the answer step-by-step. Do not answer everything in a single reply. Always remember the original prompt to provide correct guidance.”
When the user sends a message (@cl.on_message), the code retrieves the processing chain and passes the user's question to it. The AI's response is streamed back to the user in chunks for a more interactive experience. The chat history for each session is managed to provide context for subsequent turns in the conversation.
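The streaming pattern can be illustrated without Groq or Chainlit. In this sketch, fake_astream is a stand-in for the chain's astream call, and the accumulation loop mirrors what stream_token does in the UI (rendering each chunk as it arrives).

```python
import asyncio

async def fake_astream(question):
    # Stand-in for chain_with_history.astream(): yields the answer in chunks.
    for chunk in ["Let's ", "start ", "with ", "your ", "thesis."]:
        yield chunk

async def collect_response(question):
    """Accumulate streamed chunks, as stream_token() does in the chat UI."""
    response = ""
    async for chunk in fake_astream(question):
        response += chunk  # Chainlit would render each chunk immediately
    return response

print(asyncio.run(collect_response("How do I improve my intro?")))
# → Let's start with your thesis.
```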
Below is the full version of the code:
from langchain_groq import ChatGroq
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.schema import StrOutputParser
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
import chainlit as cl

# Dictionary to store chat session history
session_store = {}

def get_or_create_session_history(session_id: str):
    """Retrieve or initialize a session's chat history."""
    if session_id not in session_store:
        session_store[session_id] = InMemoryChatMessageHistory()
    return session_store[session_id]

@cl.on_chat_start
async def on_chat_start():
    """Initialize the chatbot on chat start."""
    model = ChatGroq(model="mixtral-8x7b-32768", streaming=True)
    prompt_template = ChatPromptTemplate.from_messages(
        [
            (
                "system",
                """Act as a knowledgeable essay tutor. Students will ask how to
improve their essay writing; guide them one step at a time without providing
direct answers, helping them get closer to the answer step-by-step. Do not
answer everything in a single reply. Always remember the original prompt to
provide correct guidance.""",
            ),
            # Inject stored history so earlier turns reach the model
            MessagesPlaceholder(variable_name="history"),
            ("human", "{question}"),
        ]
    )
    chain = prompt_template | model | StrOutputParser()
    chain_with_history = RunnableWithMessageHistory(
        chain,
        get_or_create_session_history,
        input_messages_key="question",
        history_messages_key="history",
    )
    cl.user_session.set("chat_chain", chain_with_history)

@cl.on_message
async def on_message(user_message: cl.Message):
    """Handle user messages."""
    chat_chain = cl.user_session.get("chat_chain")
    response_message = cl.Message(content="")  # Initialize an empty response message
    async for response_chunk in chat_chain.astream(
        {"question": user_message.content},  # Pass the user's question
        config={"configurable": {"session_id": "session1"}},  # Identify the session
    ):
        await response_message.stream_token(response_chunk)  # Stream each chunk
    await response_message.send()  # Send the final response
Process
Final Deliverable(s)
Here is a video of how the code works: