Welcome to your weekly AI Newsletter from AITechCircle!
This newsletter has become an essential resource for me and many others in the AI community, offering practical insights you can apply immediately to your work or business.
Dive into this week’s updates, and take a moment to share them with a friend or colleague who could benefit from these insights!
Today at a Glance:
- Build Phase of a Gen AI Implementation Journey
- Chief AI Officer (CAIO) Corner with a 90-day plan
- AI Weekly news and updates covering newly released LLMs
- Courses and events to attend
Build Phase: Creating Custom Gen AI Models for Enterprise Success
In the final stage of the AI adoption journey, organizations transition from leveraging pre-built solutions to taking complete control by building custom Generative AI models. In this Build phase, or the “Maker” stage, organizations embed AI as a core asset tailored precisely to their business needs and strategic goals. This step involves developing advanced models and aligning AI tightly with business objectives, maximizing its value, and ensuring robust governance.
Over the last two weeks, we covered how organizations can take up Generative AI across three distinct phases of the AI Implementation Journey:
- Adopt (Takers): Leverage AI features embedded in existing SaaS applications, focusing on ease and low costs. Starting the AI Implementation Journey - Adopt
- Buy (Shapers): Expand AI capabilities using a Generative AI portfolio, aiming to extend functionalities. PaaS for SaaS to LLMs for SaaS
- Build (Makers): Develop and train custom Generative AI models, focusing on business value and governance
Build (Makers)
The build phase represents a shift toward deeper integration and innovation as businesses create Gen AI models specific to their operations. Unlike previous phases, Adopt and Buy, this stage requires a more intensive approach, calling for custom model development, sophisticated data pipelines, and an environment ready for experimentation.
It demands strong collaboration between data science teams, IT, and business leaders and significant infrastructure, talent, and time investments.
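To make the "train or fine-tune your own model" part concrete, here is a minimal fine-tuning sketch using the Hugging Face transformers and datasets libraries. The gpt2 base model and the two-line corpus are illustrative placeholders only; a real Build-phase pipeline would add curated domain data, evaluation, and governance around it.

```python
# Minimal causal-LM fine-tuning sketch (assumes: pip install transformers datasets torch).
# "gpt2" and the toy corpus are placeholders for your own base model and domain data.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

corpus = [
    "Our enterprise support plan includes 24/7 incident response.",
    "Invoices are issued monthly and are payable within 30 days.",
]

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

def tokenize(batch):
    enc = tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)
    enc["labels"] = enc["input_ids"].copy()        # causal LM: labels mirror the inputs
    return enc

train_ds = Dataset.from_dict({"text": corpus}).map(
    tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(output_dir="domain-model", num_train_epochs=1,
                         per_device_train_batch_size=2, logging_steps=1)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```

In practice, most Maker teams start from a larger open-weight model and use parameter-efficient fine-tuning (for example, LoRA) rather than training from scratch, which is by far the costliest path in the comparison below.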
Here’s a quick comparison of the three primary approaches for embedding your organization’s unique domain knowledge into a Large Language Model (LLM): training from scratch, fine-tuning, and Retrieval-Augmented Generation (RAG).
This comparison outlines the critical aspects of each approach and helps you determine the most suitable method based on your requirements, resources, and objectives. In the “Build” phase, Oracle Cloud Infrastructure (OCI) offers a comprehensive suite of services and tools for creating, customizing, and deploying advanced Generative AI solutions.
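If fine-tuning is still more than you need, RAG leaves the base model untouched and injects domain knowledge at query time. Below is a deliberately tiny, self-contained sketch: the documents, the bag-of-words "embedding", and the prompt template are stand-ins for a real embedding model and vector store.

```python
# Toy Retrieval-Augmented Generation (RAG) sketch using only the standard library.
# In production you would swap the bag-of-words scoring for an embedding model
# and the in-memory list for a vector database.
from collections import Counter
import math

documents = [
    "Our enterprise support plan includes 24/7 incident response.",
    "Invoices are issued monthly and payable within 30 days.",
    "Model training jobs run on GPU clusters in the EU region.",
]

def embed(text: str) -> Counter:
    """Bag-of-words stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query))
    return (f"Answer using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}")

if __name__ == "__main__":
    # The prompt would be sent to the LLM of your choice; printed here for illustration.
    print(build_prompt("When are invoices due?"))
```

The pattern scales by upgrading the retriever rather than the model: the LLM call at the end remains a single prompt with the retrieved context attached.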
Could you share your feedback on which approach you are currently exploring for your organization?
Weekly News & Updates...
Last week's AI breakthroughs marked another leap forward in the tech revolution.
- You can now build with Grok using the newly released xAI API, which offers a 128k context length. link
- Physical Intelligence (π) introduced a generalist vision-language-action (VLA) model that performs dexterous tasks (laundry folding, table bussing, and many others). It is pre-trained on a large π dataset spanning many robot form factors. link
- CogVideoX1.5 is an upgraded version of the open-source CogVideoX model. The CogVideoX1.5-5B series supports 10-second videos at higher resolution, and CogVideoX1.5-5B-I2V supports video generation at any resolution. link
The Cloud: the backbone of the AI revolution
- RAG to reality: Amplify AI and cut cost; this article offers two approaches: link
- Give AI a Look: Any Industry Can Now Search and Summarize Vast Volumes of Visual Data, link
Chief AI Officer (CAIO) Corner:
Last week, I had an exciting meeting with one of the newly appointed Chief AI Officers (CAIO) in Dubai. We discussed the best ways to introduce AI in the organization and where to start.
That discussion sparked the idea for this article: if someone is entrusted with overseeing an enterprise AI initiative, how should they take it up? It made me ask myself what my first actions would be if I were starting in this role.
Favorite Tip Of The Week:
Here's my favorite resource of the week.
Scaling AI: Strategies for AI-Steady and AI-Accelerated Organizations from Gartner. Link
"What is scaling AI? Scaling AI - and generative AI - across the enterprise involves such steps as establishing a continuous process to prioritize use cases, creating a decision framework for build versus buy, piloting use cases for scalability, putting responsible AI at the forefront and investing in data and AI literacy."
Things to Know...
The Generative AI Framework for HMG from the UK Central Digital and Data Office outlines 10 principles for the safe, responsible, and effective use of generative AI in government organizations. link
- Principle 1: You know what generative AI is and what its limitations are
- Principle 2: You use generative AI lawfully, ethically and responsibly
- Principle 3: You know how to keep generative AI tools secure
- Principle 4: You have meaningful human control at the right stage
- Principle 5: You understand how to manage the full generative AI lifecycle
- Principle 6: You use the right tool for the job
- Principle 7: You are open and collaborative
- Principle 8: You work with commercial colleagues from the start
- Principle 9: You have the skills and expertise needed to build and use generative AI
- Principle 10: You use these principles alongside your organisation’s policies and have the right assurance in place
The Opportunity...
Podcast:
- This week's Open Tech Talks episode 148 is "Strategic AI Adoption for Businesses" with Nick Jain, CEO of IdeaScale.
Apple | Spotify | Amazon Music
Courses to attend:
Events:
Other Technology News
Want to stay updated on the latest in Information Technology? Here's what you should know:
- Behind The Scenes: Nvidia’s Great Rise And The New Data Center Era, story published by Forbes
- Small Language Models Gaining Popularity While LLMs Still Go Strong, reported by Forbes
Earlier weeks' posts:
And that’s a wrap!
Thank you, as always, for taking the time to read.
I’d love to hear your thoughts. Hit reply and let me know what you find most valuable this week! Your feedback means a lot.
Until next week,
Kashif Manzoor
The opinions expressed here are solely my conjecture based on experience, practice, and observation. They do not represent the thoughts, intentions, plans, or strategies of my current or previous employers or their clients/customers. The objective of this newsletter is to share and learn with the community.
Dubai, UAE
You are receiving this because you signed up for the AI Tech Circle newsletter or Open Tech Talks. If you'd like to stop receiving all emails, click here. Unsubscribe · Preferences