Google Gemini AI gets “Personal Intelligence” — and your Gmail and Photos are in play (if you opt in)

January 15, 2026

MOUNTAIN VIEW, California, January 15, 2026, 08:30 PST

  • Google rolled out a beta “Personal Intelligence” feature for the Gemini app that can draw on Gmail, Photos, Search and YouTube history.
  • The company said access is starting in the U.S. for eligible Google AI Pro and AI Ultra subscribers, with the feature off by default.
  • The update pushes Gemini toward more personalized, proactive answers — and puts fresh weight on privacy controls and user trust.

Google on Wednesday began rolling out “Personal Intelligence,” a beta feature for its Gemini AI that links to a user’s Gmail, Google Photos, Search and YouTube history to tailor responses, the company said. It said the feature is off by default and will roll out over the next week to eligible Google AI Pro and AI Ultra subscribers in the United States, before expanding more broadly. (Blog)

Why this matters now: the race is shifting from who has the best model to who can plug that model into a person’s daily life without spooking them. “Personal” is the selling point — it’s also the risk.

Google is trying to turn Gemini from a chatbot that answers questions into a system that can connect the dots across what people already store inside Google. That’s a direct shot at rivals like OpenAI and Anthropic, which can be strong on raw model output but don’t sit on the same pile of consumer data.

In plain terms, Google is letting Gemini pull details from different places at once — an email, a photo, a watch history entry — to answer one prompt. Josh Woodward, a vice president overseeing the Gemini app, wrote that the feature’s “two core strengths” are “reasoning across complex sources” and pulling “specific details” from personal content, and TechCrunch reported Gemini can surface proactive links across that data without being told where to look. (TechCrunch)

Google has been showcasing the “tire shop” scenario: Gemini finds a vehicle detail and then layers in personal context, like trip photos, to narrow down choices. 9to5Google also reported Gemini will aim to show where it got key information from connected sources, a move meant to make the answers easier to check. (9to5Google)

The privacy fine print is where this gets messy. A Google help page says connected-app data can be used to personalize Gemini and “improve Google services,” including training generative AI models, depending on settings, and it notes that some processed data may be reviewed by human reviewers. It also says Gemini does not train models “directly” on an entire Gmail inbox or the full Photos library, but may train on summaries, excerpts and inferences created to answer prompts. (Google Help)

But the system can still get it wrong — and Google is warning people not to over-trust it. The Verge reported Woodward cautioned about “over-personalization,” where the model draws links between unrelated topics, and flagged timing and nuance as weak spots, including around relationship changes. The Verge also reported the feature is powered by Google’s Gemini 3 models and is opt-in, with users choosing which apps to connect. (The Verge)

The push to embed Gemini deeper into consumer products comes as Google widens distribution through partners, too. Apple will use Google’s Gemini models for a revamped Siri later in 2026 under a multi-year deal, Reuters reported, a move that would broaden Gemini’s reach across Apple’s device base. Analyst Parth Talsania of Equisights Research said the deal shifts OpenAI into a more supporting role, while Tesla CEO Elon Musk criticized the arrangement as an “unreasonable concentration of power.” (Reuters)

Google is also threading Gemini into smaller product upgrades that can change daily workflows in quieter ways. TechCrunch reported the company is revamping the Google Trends Explore page with Gemini-powered features that automatically identify and compare related trends, rolling out on desktop. (TechCrunch)

What happens next is less about demos and more about habits: whether people turn the feature on, and whether the “personal” layer feels useful or intrusive. The other variable is trust — a single bad privacy headline, or a few high-profile wrong answers drawn from personal data, could slow adoption fast.


Technology News

  • Five camera settings I change on every new phone
    January 15, 2026, 12:00 PM EST. Smartphones ship with camera defaults tuned for safety rather than professional use, the author notes. The piece argues you can improve everyday shots by tweaking a few settings in normal mode rather than diving into Pro mode. Foremost is anti-banding: indoor LED lights can flicker at 50 Hz or 60 Hz, producing visible lines in photos and video. With Auto anti-banding enabled, the camera analyzes the lighting and selects the right filter on the fly, a practical set-and-forget fix. The article also highlights a dirty-lens warning that reminds users to clean the lens before a shot is ruined. Taken together, these tweaks offer tangible gains without mastering complex controls, aligning the hardware’s potential with real-world use.