Photography

Vivo X300 Pro’s 200MP Zeiss Camera & Snap-On Lens Aim to Replace Your DSLR – Vivo X300 Series Launch Shakes Up Mobile Photography


The Vivo X300 series brings DSLR-level camera innovation

Vivo's X300 series has officially debuted in China – and it is making waves, taking smartphone photography to a new level. Unveiled at a prestigious event in Shanghai on October 13, the lineup includes the standard Vivo X300
October 21, 2025

Technology News

  • Trump-era policy shift slows US EV investment plans
    October 21, 2025, 12:58 PM EDT. After years of policy-driven incentives, a Trump-era shift, with tariffs and relaxed emissions rules, is slowing US EV investment. Major automakers such as Ford, GM, and BMW are delaying or downsizing battery plant investments as the expected payoffs from the Inflation Reduction Act and other Biden-era incentives recede. GM now plans three Ultium-brand battery plants instead of four, while LG has postponed a $5.5B Arizona battery plant to H1 2026 and faces supply issues for Tesla and Rivian. The broader effect is a reduced EV production pipeline and a reconfiguration of the US domestic EV supply chain as tariffs reshape where cars are built.
  • How the Android-ChromeOS merger could explain the Pixel Tablet 2 cancellation
    October 21, 2025, 12:56 PM EDT. Google's looming Android-ChromeOS convergence reshapes its hardware plans. The Pixel Tablet 2 cancellation appears less about profitability and more about a broader pivot to a converged OS designed around a flagship device. The emergence of Sapphire, a development board based on the Tanjiro design for devices powered by the MediaTek Kompanio Ultra, signals a new era. The LED strip and Google-like cues make Sapphire a strong candidate for a Google-made Pixelbook Tablet to anchor the converged OS. If the merger was already underway when the Pixel Tablet 2 was paused, launching a standard Android tablet would have muddied the transition. In short, the decision may be about clearing the field for a landmark platform shift rather than simple profitability.
  • Apple alerts exploit developer that his iPhone was targeted with government spyware
    October 21, 2025, 12:54 PM EDT. Apple issued a notification to Jay Gibson, a former Trenchant developer, saying his iPhone was hit by a targeted government spyware attack. Gibson, who helped build iOS zero-days for surveillance tools, says he panicked after the alert and immediately replaced his device. The incident may mark the first known case of a spyware/exploit developer becoming a victim, with multiple sources noting similar alerts in recent months. Apple declined to comment. The episode underscores how the zero-day and surveillance market is expanding beyond traditional targets, drawing in more types of victims, from researchers to engineers. Researchers from Citizen Lab and Amnesty have long documented government use of such tools; now, apparent alerts hint at broader exposure even among developers who create them.
  • Apple Nears $4T Market Cap: Is It Catching Up in the AI Race?
    October 21, 2025, 12:52 PM EDT. A televised discussion notes that traders see Apple catching up with Nvidia in the AI race, as investors unwind the AI super-cycle hype. While Nvidia has benefited from momentum, Apple is seen as a laggard whose core business remains robust. The demand for the iPhone 17 upgrade through 2027 supports the bull case, and Apple's growing connected home devices and production plans in Vietnam could widen its ecosystem. On the other side, Nvidia remains a favorite for many, but some see its valuation as already baked in and favor Apple's more sustainable trajectory. Wall Street remains bullish on Apple's growth, even as debates over AI leadership continue.
  • UC San Diego Unveils Low-Data, Low-Compute Fine-Tuning for LLMs
    October 21, 2025, 12:50 PM EDT. Researchers at UC San Diego have developed a method to fine-tune LLMs with far less data and computing power by updating only the most impactful parameters rather than retraining the whole model. The approach reduces cost, increases flexibility, and improves generalization compared with traditional fine-tuning. They demonstrated the method on protein language models, achieving higher accuracy in predicting peptide passage across the blood-brain barrier using 326 times fewer parameters, and matching full fine-tuning in thermostability predictions with 408 times fewer parameters. As Pengtao Xie notes, this work helps even small labs and startups adapt large AI models, advancing toward democratizing AI. The study appeared in Transactions on Machine Learning Research and was supported by the NSF and NIH.
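The core idea behind this style of parameter-efficient fine-tuning can be illustrated with a toy sketch: rank parameters by how strongly they affect the loss (here, gradient magnitude), then update only the top few while freezing the rest. This is a minimal illustrative example on a tiny linear model, not the UC San Diego method itself; all function and variable names are hypothetical.

```python
# Toy sketch of parameter-efficient fine-tuning: update only the k
# parameters with the largest gradient magnitude; freeze the others.
# Model: y_pred = sum(w_i * x_i) on synthetic data (illustrative only).

def loss_and_grads(weights, data):
    """Mean squared error and its gradient w.r.t. each weight."""
    grads = [0.0] * len(weights)
    total = 0.0
    for x, y in data:
        pred = sum(w * xi for w, xi in zip(weights, x))
        err = pred - y
        total += err * err
        for i, xi in enumerate(x):
            grads[i] += 2.0 * err * xi / len(data)
    return total / len(data), grads

def sparse_finetune(weights, data, k, lr=0.01, steps=100):
    """Gradient descent that touches only the k most impactful weights."""
    _, grads = loss_and_grads(weights, data)
    # Pick the k parameters with the largest initial gradient magnitude.
    active = sorted(range(len(weights)), key=lambda i: -abs(grads[i]))[:k]
    for _ in range(steps):
        _, grads = loss_and_grads(weights, data)
        for i in active:          # all other parameters stay frozen
            weights[i] -= lr * grads[i]
    return weights

# Usage: data generated by y = 3*x0; only 1 of 3 weights is trained,
# and it converges toward 3 while the frozen weights remain 0.
data = [([x, 0.1, 0.1], 3.0 * x) for x in [0.0, 1.0, 2.0, 3.0]]
w = sparse_finetune([0.0, 0.0, 0.0], data, k=1)
```

In real LLM settings the same principle is applied at far larger scale, which is what makes the reported parameter reductions (hundreds of times fewer trainable parameters) possible.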