
Tiny device mimics human vision and memory abilities

Researchers have created a small device that “sees” and creates memories in a similar way to humans, a promising step towards applications that can make rapid, complex decisions, such as self-driving cars.

The neuromorphic invention is enabled by a sensing element, doped indium oxide, that’s thousands of times thinner than a human hair and requires no external parts to operate.

RMIT University engineers in Australia led the work, with contributions from researchers at Deakin University and the University of Melbourne.

Zuckerberg Announces Bold Plan to Jam AI Into “Every Single One of Our Products”

Meta-formerly-Facebook CEO Mark Zuckerberg has a genius new plot to add some interest to Meta-owned products: just jam in some generative AI, absolutely everywhere.

Axios reports that in an all-hands meeting on Thursday, Zuckerberg unveiled a barrage of generative AI tools and integrations, which are to be baked into both Meta’s internal and consumer-facing products, Facebook and Instagram included.

“In the last year, we’ve seen some really incredible breakthroughs — qualitative breakthroughs — on generative AI,” Zuckerberg told Axios in a statement, “and that gives us the opportunity to now go take that technology, push it forward, and build it into every single one of our products.”

Microsoft AI Introduces Orca: A 13-Billion Parameter Model that Learns to Imitate the Reasoning Process of LFMs (Large Foundation Models)

The remarkable zero-shot learning capabilities demonstrated by large foundation models (LFMs) like ChatGPT and GPT-4 have sparked a question: Can these models autonomously supervise their behavior or other models with minimal human intervention? To explore this, a team of Microsoft researchers introduces Orca, a 13-billion parameter model that learns complex explanation traces and step-by-step thought processes from GPT-4. This innovative approach significantly improves the performance of existing state-of-the-art instruction-tuned models, addressing challenges related to task diversity, query complexity, and data scaling.

The researchers acknowledge that the query and response pairs from GPT-4 can provide valuable guidance for student models. Therefore, they enhance these pairs by adding detailed responses that offer a better understanding of the reasoning process employed by the teachers when generating their responses. By incorporating these explanation traces, Orca equips student models with improved reasoning and comprehension skills, effectively bridging the gap between teachers and students.

The research team utilizes the Flan 2022 Collection to enhance Orca’s learning process further. The team samples tasks from this extensive collection to ensure a diverse mix of challenges. These tasks are then sub-sampled to generate complex prompts, which serve as queries for LFMs. This approach creates a diverse and rich training set that facilitates robust learning for Orca, enabling it to tackle a wide range of tasks effectively.
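The pipeline described above — sample tasks from a collection, turn them into prompts, and pair each prompt with a detailed teacher response — can be sketched roughly as follows. This is a minimal illustration, not the Orca implementation: the toy collection, the `query_teacher` stub, and the system message are all assumptions made for the sketch.

```python
import random

# Hypothetical stand-in for a few task sources from the Flan 2022 Collection.
FLAN_2022 = {
    "cot_gsm8k": ["If a train travels 60 km in 1.5 h, what is its speed?"],
    "niv2_summarize": ["Summarize: The cat sat on the mat because it was warm."],
    "t0_qa": ["Who wrote 'Pride and Prejudice'?"],
}

# Example system message nudging the teacher toward explanation traces.
SYSTEM_MESSAGE = "You are a helpful assistant. Explain your reasoning step by step."

def query_teacher(system_message: str, prompt: str) -> str:
    """Placeholder for a call to the teacher LFM (e.g. GPT-4).

    A real pipeline would invoke the model API here; we return a stub
    string so the sketch runs without network access."""
    return f"[teacher explanation trace for: {prompt!r}]"

def build_training_set(collection, tasks_per_source=1, seed=0):
    """Sub-sample prompts from each task source and pair each query
    with a detailed teacher response (the 'explanation trace')."""
    rng = random.Random(seed)
    examples = []
    for task_name, prompts in collection.items():
        for prompt in rng.sample(prompts, min(tasks_per_source, len(prompts))):
            examples.append({
                "system": SYSTEM_MESSAGE,
                "query": prompt,
                "response": query_teacher(SYSTEM_MESSAGE, prompt),
            })
    return examples

training_set = build_training_set(FLAN_2022)
print(len(training_set))  # one sampled example per task source in this toy collection
```

The student model would then be fine-tuned on these (system, query, response) triples, so it imitates the teacher’s step-by-step reasoning rather than just its final answers.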

Here’s what Mark Zuckerberg thinks about Apple’s Vision Pro

Zuckerberg addressed Apple’s headset unveiling in a meeting with Meta employees, telling them that it ‘could be the vision of the future of computing, but like, it’s not the one that I want.’

Mark Zuckerberg doesn’t seem fazed by Apple’s introduction of the Vision Pro.

In a companywide meeting with Meta employees today that The Verge watched, the CEO said Apple’s device didn’t present any major breakthroughs in technology that Meta hadn’t “already explored” and that its vision for how people will use the device is “not the one that I want.” He also pointed to the fact that Meta’s upcoming Quest 3 headset will be much cheaper, at $499 compared to the Vision Pro’s $3,499 price tag, giving Meta…


One takeaway: “it costs seven times more.”

Sodium on Steroids: A Nuclear Physics Breakthrough Thought To Be Impossible

Nuclear physicists at RIKEN have successfully created an extremely neutron-rich isotope of sodium, ³⁹Na, previously predicted by many atomic nuclei models to be non-existent. This discovery has significant implications for our understanding of atomic nuclei structure and the astrophysical processes that form the heavier elements found on Earth.

Nuclear physicists have made the most neutron-rich form of sodium yet, which will help reveal more about the complex world of nuclei.


Intelligence Explosion — Part 2/3

Hallucination!

Can “hallucinations” generate an alternate world, prophesying falsehood?

As I write this article, NVIDIA is surpassing Wall Street’s expectations. The company, headquartered in Santa Clara, California, has just joined the exclusive club of companies valued at over a trillion dollars, which until now numbered only five [Apple (2.7T), Microsoft (2.4T), Saudi Aramco (2T), Alphabet/Google (1.5T), and Amazon (1.2T)], as its shares rose nearly 25% in a single day! A clear sign of how the widespread use of Artificial Intelligence (AI) can dramatically reshape the technology sector.

Intel has announced an ambitious plan to develop scientific generative AIs designed with one trillion parameters. These models will be trained on various types of data, including general texts, code, and scientific information. In comparison, OpenAI’s GPT-3 has 175 billion parameters (the size of GPT-4 has not yet been disclosed by OpenAI). The semiconductor company’s main focus is to apply these AIs in the study of areas such as biology, medicine, climate, cosmology, chemistry, and the development of new materials. To achieve this goal, Intel plans to launch a new supercomputer called Aurora, with processing capacity exceeding two exaflops, later this year.