Snapchat introduced AI-powered glasses
PLUS: Meta Connect 2024: What to expect?
Howdy! It’s Barsee again. Happy Thursday!
In today’s edition:
👓️ Snap introduced AI-powered glasses
🌐 Meta Connect 2024: What to expect?
🫦 Hume AI Unveils Ultra-Realistic Conversational Model
🤖 Plus trending AI tools, guides, and resources.
Ready, set, go…
TOGETHER WITH GROWTHSCHOOL
Still struggling to achieve your 2024 Goals of multiplying your income, achieving work-life balance, and managing your time efficiently?
Join this 3-hour intensive workshop on AI & ChatGPT tools (usually $399) but FREE for the first 100 readers.
Save your free spot here (seats are filling fast!)
Set aside 3 hours of your time to learn AI strategies & hacks that less than 1% of people know!
🗓️ Tomorrow | ⏱️ 10 AM EST
In this workshop, you will learn how to:
Make smarter decisions based on data in seconds using AI
Automate daily tasks and increase productivity & creativity
Skyrocket your business growth by leveraging the power of AI
Save 1000s of dollars by using ChatGPT to simplify complex problems
SNAPCHAT
👓️ Snap introduced AI-powered glasses
Snap recently unveiled its fifth-generation Spectacles, standalone AR glasses powered by the new Snap OS. These glasses integrate AI capabilities to enhance social interactions through augmented reality.
How does it work?
The Spectacles have four tracking cameras for spatial awareness and hand-tracking sensors, enabling multi-modal AI to overlay AR content in the real world.
Powered by the new Snap OS, they can be controlled through hand gestures and voice commands. Menus appear in the user's palm, allowing interaction with virtual buttons, objects, and environments.
What’s next?
Snap has also announced its partnership with OpenAI. This collaboration will allow developers to incorporate advanced AI models into Spectacles apps, potentially enhancing the contextual understanding of the wearer's environment.
How do I access it?
Snap is initially targeting developers rather than consumers. The company is selling Spectacles through its developer program, priced at $99 per month with a 12-month minimum commitment.
PS: Here is a quick demo.
META
🌐 Meta Connect 2024: What to expect?
The two-day event, which kicks off on September 25, will highlight the company’s latest hardware and software innovations. Here’s what to expect from Meta Connect 2024:
The event will showcase Meta’s latest offerings across its AR/VR headsets, smart glasses and wearables, and AI divisions, in a keynote presented by CEO Mark Zuckerberg.
Meta will likely unveil Orion - its next-generation augmented reality glasses that can layer holographic imagery on top of reality.
Meta is also rumored to be releasing new Ray-Ban smart glasses with a built-in screen, camera, speaker, and microphone.
Meta might also reveal the Quest 3S, a new, more cost-effective version of the Quest 3 headset that could eventually replace the Quest 2.
Why does it matter?
With major players like Apple having recently launched the Vision Pro, Meta's unveiling of the Orion AR glasses and a more affordable Quest 3S headset could intensify competition. It will be interesting to see whether Meta can surpass Apple in user experience and functionality.
VOICE TO VOICE AI
🫦 Hume AI Unveils Ultra-Realistic Conversational Model
Hume AI recently introduced EVI 2, a new voice-to-voice model that holds highly human-like conversations with natural tones and varied styles, with a focus on emotional intelligence.
What does it do?
EVI 2 can mimic a variety of personalities and accents, and it supports multiple languages.
The model can generate a variety of voice tones and even meet specific needs, such as adjusting speaking speed, imitating rap, and other personalized requests.
It can understand the user’s tone of voice and adjust its responses based on the emotional state of the conversation.
Importantly, it prevents direct voice cloning, addressing unique risks associated with this capability. Instead, it allows users to create synthetic voices by adjusting base voices in gender, nasality, and pitch.
How can you use EVI 2?
You can interact directly with EVI 2 through the Hume AI platform and experience its real-time voice conversation function.
QUICK HITS
There is a new AI-powered Twitter clone called SocialAI. Imagine X/Twitter where you have millions of followers, except the catch is they are all AI; you have zero human followers. (link)
Google’s new robots, Aloha Unleashed and DemoStart, demonstrated impressive dexterity, performing tasks like tying a shoelace, hanging a shirt, and cleaning a kitchen. (link)
This brain implant lets people control Amazon Alexa with their minds. (link)
Microsoft and BlackRock formed a group to raise $100 billion to invest in AI infrastructure. (link)
Salesforce released Agentforce, a suite of low-code tools to build autonomous AI agents that can perform reasoning for sales, marketing, and commerce-related tasks. (link)
Perplexity added a new "reasoning" focus feature for Pro users, utilizing OpenAI’s o1-mini model for puzzles, math, and coding tasks. (link)
Slack unveiled an AI-powered note-taking tool capable of summarizing meetings, including those held over Google Meet. (link)
USEFUL AI LINKS
Trending Tools
CuriousThing > AI to replace voicemail and detect robocalls. (link)
BoltAI > All AI models in one app. (link)
Flair AI > New AI product video commercials. (link)
SocialAI > Your personal AI-powered social network. (link)
Compute > Agent-driven research engine for power AI users. (link)
Way Faster > Interview your entire pipeline 10x faster using voice AI. (link)
Resources / Guides
Two new resources from Anthropic for learning prompt engineering. (link)
AI TRAINING
How to create custom AI chatbots (Gems) in Google Gemini
Here's the step-by-step process:
Go to the Gemini website.
Click on "Gem Manager" in the bottom left menu, then select "New Gem."
A new window will open. Name your Gem with something descriptive so you can easily find it later.
Next, give your Gem instructions directly, such as “you are an expert on travel destinations.” Specify its purpose, personality, and preferred language.
Be sure to test your Gem using the prompt box on the right to ensure it responds correctly and in the style you want.
You can also use standard features like Google search and draft views. Your Gem will provide references to web pages for its information.
When satisfied, click "Save" to add your Gem to the list. You can access it anytime from the left pane or Gem Manager.
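If you prefer to work programmatically rather than through the Gems UI, you can approximate the same "custom persona" setup with the Gemini API's system instructions. The sketch below is an analogous approach, not the Gems feature itself; it assumes the google-generativeai Python SDK, a GOOGLE_API_KEY environment variable, and a placeholder model name and instruction text.

# A minimal sketch of a Gem-like persona via the Gemini API (not the Gems UI).
# Assumes: `pip install google-generativeai` and a GOOGLE_API_KEY env variable.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# The system instruction plays the role of the Gem's instructions:
# purpose, personality, and preferred language.
travel_gem = genai.GenerativeModel(
    model_name="gemini-1.5-flash",  # placeholder model name
    system_instruction=(
        "You are an expert on travel destinations. "
        "Answer concisely, suggest off-the-beaten-path options, and reply in English."
    ),
)

# "Test your Gem": send a prompt and check the tone and style of the reply,
# just as you would in the prompt box on the right.
chat = travel_gem.start_chat()
response = chat.send_message("Plan a 3-day food-focused trip to Lisbon.")
print(response.text)

This mirrors steps 4 and 5 above: the system instruction carries the persona, and the chat message is your test prompt.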
DAILY DOSE OF CONTENTS
1/ A humanoid robot, Nadia, is remotely controlled for boxing training using a simple VR motion capture setup.
2/ OpenAI CEO Sam Altman confirms that Level-3 Agents are coming soon.
"The shift to level 2 took time, but it accelerates the development of level 3. This will enable impactful agent-based experiences that will greatly impact advancements in technology." — Haider (@slow_developer)
6:03 AM • Sep 19, 2024
3/ BMW color-changing car.
4/ Halide recorded an iPhone’s front-facing TrueDepth camera doing Face ID with an infrared-sensitive camera. Watch the flashes, it’s insane.
"We recorded an iPhone’s front-facing TrueDepth camera doing Face ID with an infrared-sensitive camera." — Halide + Kino (@halidecamera)
12:05 AM • Sep 18, 2024
THAT’S ALL FOR TODAY
That’s all for today’s issue, folks.
💡 Help me get better and suggest new ideas at [email protected] or @heyBarsee
👍️ Like what you see? Subscribe here
Thanks for being here.
HOW WAS TODAY'S NEWSLETTER?
REACH 100K+ READERS
Acquire new customers and drive revenue by partnering with us
Sponsor AI Valley and reach 100,000+ entrepreneurs, founders, software engineers, investors, and more.
If you’re interested in sponsoring us, email [email protected] with the subject “AI Valley Ads”.