Streamline the Development of Real-Time AI Applications with MindsDB Docker Extension
This post was contributed by Martyna Slawinska, Software Engineer at MindsDB, in collaboration with Ajeet Singh, Developer Advocate at Docker.
AI technology faces several challenges that hinder its progress. Building an AI-powered application requires significant resources, including qualified professionals, budget, and time. Prominent obstacles include:
- Bringing (real-time) data to AI models through data pipelines is complex and requires constant maintenance.
- Testing different AI/ML frameworks requires dedicated setups.
- Customizing AI with dynamic data and making the AI system improve itself automatically is a major undertaking.
These difficulties put AI systems out of reach for small and large enterprises alike. The MindsDB platform, however, helps solve these challenges, and it’s now available in the Extensions Marketplace of Docker Desktop.
In this article, we’ll show how MindsDB can streamline the development of AI-powered applications and how easily you can set it up via the Docker Desktop Extension.
How does MindsDB facilitate the development of AI-powered apps?
MindsDB is a platform for customizing AI from dynamic data. With its nearly 200 integrations to data sources and AI/ML frameworks, any developer can use their own data to customize AI for their purposes, faster and more securely.
Let’s address these challenges one by one:
- MindsDB integrates with numerous data sources, including databases, vector stores, and applications. To make your data accessible to many popular AI/ML frameworks, all you have to do is execute a single statement to connect your data to MindsDB.
- MindsDB integrates with popular AI/ML frameworks, including LLMs and AutoML. So once you connect your data to MindsDB, you can pass it to different models to pick the best one for your use case and deploy it within MindsDB.
- With MindsDB, you can manage models and data seamlessly, implement custom automation flows, and make your AI systems improve themselves with continuous finetuning (see the sketch after this list).
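For example, a recurring finetuning flow can be set up directly in MindsDB with a job. The following is only a minimal sketch with hypothetical names (a model called my_model, a connected data source called my_datasource, and a table called my_table); the rest of this article walks through creating these kinds of objects for real:
CREATE JOB finetune_my_model (
    FINETUNE my_model
    FROM my_datasource
        (SELECT * FROM my_table)
)
EVERY 1 day;
The job runs once a day and finetunes the model with the current contents of the table, so the AI system keeps improving as the underlying data changes.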
With MindsDB, you can build AI-powered applications easily, even with no AI/ML experience. You can interact with MindsDB through SQL, MongoDB-QL, REST APIs, Python, and JavaScript.
Follow along to learn how to set up MindsDB in Docker Desktop.
How does MindsDB work?
With MindsDB, you can connect your data from a database, a vector store, or an application, to various AI/ML models, including LLMs and AutoML models (Figure 1). By doing so, MindsDB brings data and AI together, enabling the intuitive implementation of customized AI systems.
MindsDB enables you to easily create and automate AI-powered applications. You can deploy, serve, and fine-tune models in real-time, utilizing data from databases, vector stores, or applications, to build AI-powered apps — using universal tools developers already know.
Find out more about MindsDB and its features, as well as use cases, on the MindsDB website.
Why run MindsDB as a Docker Desktop Extension?
MindsDB can be easily installed on your machine via Docker Desktop. The MindsDB Docker Desktop Extension lets you run and use MindsDB within the Docker Desktop environment.
As MindsDB integrates with numerous data sources and AI frameworks, each integration requires a specific set of dependencies. With MindsDB running in Docker Desktop, you can easily install only the required dependencies to keep the image lightweight and less prone to issues.
Running MindsDB as a Docker Desktop Extension gives you the flexibility to:
- Set up your MindsDB environment easily by installing the extension.
- Customize your MindsDB environment by installing only the required dependencies.
- Monitor your MindsDB environment via the logs accessible through Docker Desktop.
Next, we’ll walk through setting up MindsDB in Docker Desktop. For more information, refer to the documentation.
Getting started
MindsDB setup in Docker Desktop
To get started, you’ll need to download and set up Docker Desktop on your computer. Then, follow the steps below to install MindsDB in Docker Desktop:
First, go to the Extensions page in Docker Desktop, search for MindsDB, and install the MindsDB extension (Figure 2).
Then, access MindsDB inside Docker Desktop (Figure 3).
This setup of MindsDB uses the mindsdb/mindsdb:latest Docker image, which is a lightweight image of MindsDB that comes with a default set of integrations preloaded.
Now that you have installed MindsDB in Docker Desktop, think of a use case you want to implement and list all the integrations you want to use. For example, if you want to use data from your PostgreSQL database and one of the models from Anthropic to analyze your data, you need to install the dependencies for Anthropic (the dependencies for PostgreSQL are installed by default).
You can find more use cases on the MindsDB website.
Here is how to install dependencies (Figure 4):
- In the MindsDB editor, go to Settings and Manage Integrations.
- Select the integrations you want to use and choose Install.
We customized the MindsDB image by installing only the required dependencies. Visit the documentation to learn more.
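To illustrate the Anthropic example above, using the integration after installing its dependencies follows the same pattern as the LangChain setup later in this article: create an ML engine, then a model on top of it. This is only a hedged sketch; the engine, model, and parameter names used here are assumptions, so check the Anthropic handler documentation for the exact syntax:
CREATE ML_ENGINE anthropic_engine    -- hypothetical engine name
FROM anthropic
USING
    anthropic_api_key = 'your-anthropic-api-key';   -- parameter name assumed

CREATE MODEL anthropic_model         -- hypothetical model name
PREDICT answer
USING
    engine = 'anthropic_engine',
    column = 'question',                 -- assumed name of the input column parameter
    model = 'claude-3-haiku-20240307';   -- assumed model identifier; use any Claude model you have access to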
AI Agents deployment with MindsDB
In this section, we’ll showcase the AI Agents feature developed by MindsDB. AI Agents come with an underlying large language model and a set of skills to answer questions about your data stored in databases, files, or websites (Figure 5).
Agents require a model in conversational mode. Currently, MindsDB supports such models via the LangChain handler.
There are two types of skills, as follows:
- The Text-to-SQL skill translates questions asked in natural language into SQL code to fetch correct data and answer the question.
- The Knowledge Base skill stores and searches data assigned to it, utilizing embedding models and vector stores (a minimal sketch follows this list).
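The tutorial below demonstrates the Text-to-SQL skill. For completeness, here is a minimal sketch of a Knowledge Base skill, assuming a knowledge base named my_kb has already been created; the parameter names follow the pattern in the MindsDB documentation and may differ in your version:
CREATE SKILL my_kb_skill
USING
    type = 'knowledge_base',
    source = 'my_kb',    -- name of an existing knowledge base (assumed)
    description = 'documentation and FAQ content stored in my_kb';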
Let’s get started.
Step 1. Connect your data source to MindsDB.
Here, we use our sample PostgreSQL database and connect it to MindsDB:
CREATE DATABASE example_db
WITH ENGINE = "postgres",
PARAMETERS = {
    "user": "demo_user",
    "password": "demo_password",
    "host": "samples.mindsdb.com",
    "port": "5432",
    "database": "demo",
    "schema": "demo_data"
};
Let’s preview the table of interest:
SELECT *
FROM example_db.car_sales;
This table stores details of cars sold in recent years. This data will be used to create a skill in the next step.
Step 2. Create a skill.
Here, we create a Text-to-SQL skill using data from the car_sales table:
CREATE SKILL my_skill
USING
    type = 'text_to_sql',
    database = 'example_db',
    tables = ['car_sales'],
    description = 'car sales data of different car types';
The skill description should be accurate because the model uses it to decide which skill to choose to answer a given question. This skill is one of the components of an agent.
Step 3. Create a conversational model.
As noted earlier, agents require a model in conversational mode, which MindsDB currently supports via the LangChain handler.
Note that if you choose one of the OpenAI models, the following configuration of an engine is required:
CREATE ML_ENGINE langchain_engine
FROM langchain
USING
    openai_api_key = 'your-openai-api-key';
Now you can create a model using this engine:
CREATE MODEL my_conv_model
PREDICT answer
USING
    engine = 'langchain_engine',
    input_column = 'question',
    model_name = 'gpt-4',
    mode = 'conversational',
    user_column = 'question',
    assistant_column = 'answer',
    max_tokens = 100,
    temperature = 0,
    verbose = True,
    prompt_template = 'Answer the user input in a helpful way';
You can adjust the parameter values, such as prompt_template, to fit your use case. This model is one of the components of an agent.
Step 4. Create an agent.
Now that we have a skill and a conversational model, let’s create an AI Agent:
CREATE AGENT my_agent
USING
    model = 'my_conv_model',
    skills = ['my_skill'];
You can query this agent directly to get answers about data from the car_sales table, since that table has been assigned to the skill (my_skill), which in turn has been assigned to the agent (my_agent).
Let’s ask some questions:
SELECT *
FROM my_agent
WHERE question = 'what is the most commonly sold model?';
Figure 6 shows the output generated by the agent:
Furthermore, you can connect this agent to a chat app, like Slack, using the chatbot object.
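Assuming a Slack connection named my_slack has already been created as a data source in MindsDB, a chatbot wrapping the agent could be created roughly like this (a hedged sketch; see the MindsDB chatbot documentation for the exact parameters):
CREATE CHATBOT my_chatbot
USING
    database = 'my_slack',    -- connection to the chat app (assumed name)
    agent = 'my_agent';       -- the agent created above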
Conclusion
MindsDB streamlines data and AI integration for developers. It offers seamless connections with various data sources and AI frameworks, enabling users to customize AI workflows and obtain predictions on their data in real time.
Running MindsDB as a Docker Desktop Extension not only simplifies dependency management but also ensures consistent environments across systems and minimizes setup complexity.