The AI continuum

ChatGPT has turned everything we know about AI on its head. Or has it?

AI encompasses many things. Generative AI and large language models (LLMs) like ChatGPT are only one aspect of AI, but they're the best-known part. In many ways, ChatGPT put AI in the spotlight, creating widespread awareness of AI as a whole and helping to spur the pace of its adoption.

You probably know that ChatGPT wasn’t built overnight. It’s the culmination of a decade of work on deep learning AI. That decade has given us newfound ways to use AI—from apps that know what you’ll type next, to cars that drive themselves and algorithms for scientific breakthroughs.

AI’s broad applicability and the popularity of LLMs like ChatGPT have IT leaders asking: Which AI innovations can deliver business value to our organization without devouring my entire technology budget? Here is some guidance.

AI options

From a high-level standpoint, here are the AI options:

Generative AI: The state of the art
Current generative AI leaders such as OpenAI's ChatGPT, Meta's Llama 2, and Adobe Firefly use large generative models to produce immediate value for knowledge workers, creatives, and business operations.
Model sizes: ~5 billion to >1 trillion parameters.
Great for: Turning prompts into new material.
Downsides: Can hallucinate, fabricate, and produce unpredictable results.
Deep learning AI: A rising workhorse
Deep learning AI builds on the same neural network techniques as generative AI, but it isn't generative: it can't understand context, write poems, or create drawings. Instead, it powers smart applications for translation, speech-to-text, cybersecurity monitoring, and automation.
Model sizes: ~Millions to billions of parameters.
Great for: Extracting meaning from unstructured data like network traffic, video, and speech.
Downsides: Not generative; model behavior can be a black box; results can be challenging to explain.
Classical machine learning: Patterns, predictions, and decisions
Classical machine learning is the proven backbone of pattern recognition, business intelligence, and rules-based decision-making, and it produces explainable results.
Model sizes: Uses algorithmic and statistical methods rather than neural network models.
Great for: Classification, identifying patterns, and predicting results from smaller datasets.
Downsides: Lower accuracy; behind the rigid chatbots of the past; not suited to unstructured data.
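To make the classical option concrete, here is a minimal sketch of a nearest-centroid classifier in plain Python. The customer-segment labels and feature values are invented for illustration; the point is that "training" and prediction are simple, inspectable arithmetic, which is why results are easy to explain.

```python
from statistics import mean

# Feature vectors are (avg_order_value, orders_per_month);
# the segments and numbers below are made-up illustrative data.
train = {
    "bargain":  [(10.0, 1.0), (12.0, 2.0), (8.0, 1.0)],
    "frequent": [(40.0, 9.0), (55.0, 11.0), (48.0, 10.0)],
}

# "Training" is just computing one centroid (mean point) per class.
centroids = {
    label: tuple(mean(col) for col in zip(*points))
    for label, points in train.items()
}

def classify(x):
    """Predict the label whose centroid is closest (squared distance)."""
    return min(
        centroids,
        key=lambda label: sum((a - b) ** 2 for a, b in zip(x, centroids[label])),
    )
```

Every prediction can be justified by pointing at the distances involved, which is exactly the explainability advantage the table above describes.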

5 ways to put LLMs and deep learning AI to work

While LLMs are making headlines, every flavor of AI—generative AI, standard deep learning, and classical machine learning—has value. How you use AI will vary based on the nature of your business, what you produce, and the value you can create with AI technologies. 

Here are five ways to put AI to work, ranked from easiest to most difficult. 

1. Use the AI that comes with the applications you already have

Business and enterprise software providers like Adobe, Salesforce, Microsoft, Autodesk, and SAP are integrating multiple types of AI into their applications. The price-performance value of consuming AI via the tools you already use is hard to beat.

2. Consume AI as a service 

AI-as-a-Service platforms are proliferating rapidly. There are generative AI assistants for coders, highly specialized AI for specific industries, and deep learning models for discrete tasks. Pay-as-you-go offerings provide the convenience of a turnkey solution that can scale rapidly.

3. Build a custom workflow with an API

With an application programming interface (API), applications and workflows can tap into world-class generative AI. APIs make it easy for you to extend AI services internally or to your customers through your products and services. 
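As a sketch of what such a workflow can look like, the snippet below assembles a chat-style API request using only Python's standard library. The endpoint URL, API key, model name, and JSON shape are assumptions modeled on common chat-completions APIs, not any specific provider's contract; substitute the values from your provider's documentation.

```python
import json
import urllib.request

# Hypothetical endpoint and credentials; replace with your provider's values.
API_URL = "https://api.example.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

def build_request(prompt: str, model: str = "example-model"):
    """Assemble an HTTP request for a chat-style generative AI API.

    The JSON body follows the widely used chat-completions convention:
    a model name plus a list of role/content messages.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

def summarize(ticket_text: str) -> str:
    """Example workflow step: summarize a support ticket via the API."""
    req = build_request(f"Summarize this support ticket:\n{ticket_text}")
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]
```

Wrapping the call in a small function like `summarize` is what lets you embed the same AI service in internal tools and customer-facing products alike.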

4. Retrain and fine-tune an existing model

Retraining proprietary or open-source models on specific datasets creates smaller, more refined models that can produce accurate results with lower-cost cloud instances or local hardware. 
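Fine-tuning is conceptually just continued training: start from weights learned on general data and take a few gradient steps on your own dataset. The toy below shows that principle on a one-parameter linear model; real LLM fine-tuning applies the same idea across billions of weights, which is why it is so much cheaper than training from scratch.

```python
# Toy "pretrain then fine-tune" on a one-parameter model y = w * x.

# Pretraining data (general domain): true slope is 2.
pre_x, pre_y = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]

# Closed-form least-squares "pretraining" for w.
w = sum(x * y for x, y in zip(pre_x, pre_y)) / sum(x * x for x in pre_x)

# Fine-tuning data (a narrow domain): true slope is 3.
ft_x, ft_y = [1.0, 2.0], [3.0, 6.0]

# A few small gradient steps on mean squared error nudge w toward
# the domain's slope without retraining from scratch.
lr = 0.1
for _ in range(10):
    grad = (2 / len(ft_x)) * sum((w * x - y) * x for x, y in zip(ft_x, ft_y))
    w -= lr * grad

print(round(w, 3))  # close to 3.0
```

The fine-tuned parameter lands near the domain's value after only a handful of cheap updates, mirroring how a refined model can run on lower-cost cloud instances or local hardware.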

5. Train a model from scratch

Training your own LLM is out of reach for most organizations, and it still may not be a wise investment. Training a GPT-4-scale, trillion-parameter model takes billions of dollars' worth of supercomputing hardware, months of time, and scarce data science talent. Fortunately, most organizations can build on existing proprietary or open-source models instead.

What’s the right infrastructure for AI?

The right infrastructure for AI depends on many factors: the type of AI, the application, and how it is consumed. Matching AI workloads with the right hardware and using fit-for-purpose models improves efficiency, increases cost-effectiveness, and reduces the compute required.

From a processor performance standpoint, it's about delivering seamless user experiences. That means producing tokens within 100 milliseconds or faster, which works out to roughly 450 words per minute; if results take longer than 100 milliseconds, users notice lag. Using this metric as a benchmark, many near-real-time situations may not require specialized hardware.

For example, a major cybersecurity provider developed a deep learning model to detect computer viruses. Financially, it was impractical to deploy the model on GPU-based cloud infrastructure. Once engineers optimized the model for the built-in AI accelerators on Intel® Xeon® processors, they could scale the service to every firewall the company secures using less-expensive cloud instances.[1]
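The ~450 words-per-minute figure follows from simple arithmetic, assuming the common rule of thumb of roughly 0.75 English words per token:

```python
# Back-of-the-envelope check of the 100 ms-per-token latency benchmark.
MS_PER_TOKEN = 100        # target latency per generated token
WORDS_PER_TOKEN = 0.75    # rough rule of thumb for English text

tokens_per_second = 1000 / MS_PER_TOKEN      # 10 tokens/s
tokens_per_minute = tokens_per_second * 60   # 600 tokens/min
words_per_minute = tokens_per_minute * WORDS_PER_TOKEN

print(words_per_minute)  # 450.0
```

Any system that sustains this rate already outpaces human reading speed, which is why modest hardware can be sufficient for many interactive workloads.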

Tips for putting AI to work

Generative AI is a once-in-a-generation disruption on par with the internet, the telephone, and electricity—except it’s moving much faster. Organizations of every size have to put AI to work as effectively and efficiently as possible, but that doesn’t always mean huge capital investments in AI supercomputing hardware. 

  • Pick the right AI for your needs. Don’t use generative AI for a problem that classical machine learning has already solved.
  • Match models to specific applications. Retraining, refining, and optimizing create efficiency so you can run on less expensive hardware.
  • Use compute resources wisely. Whether you run in the public cloud or on-premises, keep efficiency top of mind.
  • Start small and notch wins. You’ll learn how to use AI effectively, begin shifting your culture, and build momentum.

Most importantly, remember you’re not alone on this journey. Open-source communities and companies like Dell and Intel are here to help you weave AI throughout your enterprise. 

About Intel

Intel hardware and software are accelerating AI everywhere. Intel solutions power AI training, inference, and applications in everything from Dell supercomputers and data centers to rugged Dell edge servers for networking and IoT.

About Dell

Dell Technologies accelerates your AI journey from possible to proven by leveraging innovative technologies, a comprehensive suite of professional services, and an extensive network of partners.

[1] Intel, Palo Alto Networks Automates Cybersecurity with Machine Learning, Feb 28, 2023, accessed December 2023
