LangChain, Ollama, and Llama 3 prompt and response

Written by Aionlinecourse

LangChain, Ollama, and Llama 3 are powerful tools commonly used by researchers and developers in Machine Learning and Natural Language Processing. Together they streamline the development of advanced conversational agents and improve the overall user experience with more accurate, context-aware responses. To leverage these tools effectively, we need a solid understanding of how to craft prompts and interpret the resulting responses.

LangChain

LangChain is a powerful open-source framework for building advanced NLP applications with large language models (LLMs). Its module-based approach allows developers and data scientists to compare different prompts, and even different foundation models, with minimal need to rewrite code. This modular environment also allows for programs that use multiple LLMs.
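
To see this modularity in practice, here is a minimal sketch (assuming the langchain-core package is installed; the template text is illustrative):

from langchain_core.prompts import PromptTemplate

# One reusable template; swapping prompts or models means changing
# a variable, not rewriting the surrounding code.
template = PromptTemplate.from_template("Summarize the following text:\n{text}")

prompt_value = template.format(text="LangChain is a framework for building LLM applications.")
print(prompt_value)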

Ollama

Ollama is a tool for running open-source large language models locally. By executing models such as Llama 3 on your own machine, it offers strong customization and efficiency for natural language processing tasks while keeping your data on your own hardware. Its simple interface for serving models makes it a good fit for a wide variety of applications.
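
As a quick illustration, a locally served model can be queried in a couple of lines. This is a sketch that assumes Ollama is installed, its server is running, and the llama3 model has already been pulled:

from langchain_community.llms import Ollama

# Runs entirely against the local Ollama server; no data leaves the machine.
llm = Ollama(model="llama3")
print(llm.invoke("Explain what a stop token is in one sentence."))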

Llama 3

Unlike LangChain and Ollama, Llama 3 is not a framework but a model: the latest iteration in Meta's series of advanced open-source language models, known for its exceptional performance in generating human-like text. It responds best to detailed, context-rich prompts, which the two tools above help you construct and deliver.
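
One practical detail matters for the solutions below: Llama 3's instruct variants are trained on a specific chat template. <|begin_of_text|> opens the conversation, each message is wrapped in <|start_header_id|>role<|end_header_id|> headers, and <|eot_id|> marks the end of a turn, which is why both solutions pass it as a stop token. The model also needs to be available locally first; with Ollama it can be downloaded beforehand with the command "ollama pull llama3".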

Solution 1:

Using a PromptTemplate from LangChain and setting a stop token for the model, I was able to get a single correct response.

from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate # Added

llm = Ollama(model="llama3", stop=["<|eot_id|>"]) # Added stop token so generation ends at Llama 3's end-of-turn marker

def get_model_response(user_prompt, system_prompt):
    # NOTE: No f string and no whitespace in curly braces
    template = """
        <|begin_of_text|>
        <|start_header_id|>system<|end_header_id|>
        {system_prompt}
        <|eot_id|>
        <|start_header_id|>user<|end_header_id|>
        {user_prompt}
        <|eot_id|>
        <|start_header_id|>assistant<|end_header_id|>
        """

    # Added prompt template
    prompt = PromptTemplate(
        input_variables=["system_prompt", "user_prompt"],
        template=template
    )
    
    # Modified invoking the model: invoke() is preferred over calling llm() directly
    response = llm.invoke(prompt.format(system_prompt=system_prompt, user_prompt=user_prompt))
    
    return response
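
A hypothetical call to illustrate usage (the prompt strings here are my own examples, not part of the original solution):

# Illustrative usage; the prompts are example strings
answer = get_model_response(
    user_prompt="What is the capital of France?",
    system_prompt="Answer in one word.",
)
print(answer)  # expected: a short answer such as "Paris"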

Solution 2:

Here is sample code that works with LangChain and LlamaCpp using a local model file. Note that the library now insists on using the invoke method rather than directly calling "llm(message)".

from langchain_community.llms import LlamaCpp
from langchain_core.prompts import PromptTemplate

llm = LlamaCpp(
    model_path="C:\\Users\\LENOVO\\models\\QuantFactory\\Meta-Llama-3-8B-Instruct-GGUF\\Meta-Llama-3-8B-Instruct.Q3_K_L.gguf",
    n_gpu_layers=-1,      # offload all layers to the GPU
    temperature=0,        # deterministic output
    stop=["<|eot_id|>"],  # stop at Llama 3's end-of-turn token
)

template = """
        <|begin_of_text|>
        <|start_header_id|>system<|end_header_id|>
        {system_prompt}
        <|eot_id|>
        <|start_header_id|>user<|end_header_id|>
        {user_prompt}
        <|eot_id|>
        <|start_header_id|>assistant<|end_header_id|>
        """
def get_response(country):
    sys_template_str = "Give a one or two word answers only."
    human_template_str = "What is the capital of {country}?"

    prompt = PromptTemplate.from_template(template.format(system_prompt = sys_template_str,
                                                          user_prompt = human_template_str))
    session = prompt | llm 
    response = session.invoke({"country":country})
    print(response)

get_response("Australia")
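
The prompt | llm expression builds a runnable chain using the LangChain Expression Language (LCEL): invoke({"country": country}) first fills the template with the country name and then passes the formatted prompt to the model. Running this snippet should print a short answer such as "Canberra", though the exact wording depends on the model build and sampling settings.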

By following the steps above, you will be able to generate prompts and responses with LangChain, Ollama, and Llama 3. Crafting detailed prompts and interpreting the responses carefully can significantly enhance your NLP applications. Keep experimenting, refining, and leveraging feedback to continuously improve the prompts and responses in your projects.

Thank you for reading the article.
