
Breakthrough Discovery of New Model for “Global” DNA Repair

Breakthrough techniques in living cells upend field.

Two studies provide a radically new picture of how bacterial cells continually repair damaged sections (lesions) in their DNA.

Led by researchers from NYU Grossman School of Medicine, the work revolves around the delicacy of DNA molecules, which are vulnerable to damage by reactive byproducts of cellular metabolism, toxins, and ultraviolet light. Because damaged DNA can cause detrimental changes to the DNA code (mutations) and cell death, cells have evolved DNA repair machineries. A major unresolved question in the field, however, is how these machineries rapidly search for and find rare stretches of damage amid the “vast fields” of undamaged DNA.

Pathways Language Model (PaLM): Scaling to 540 Billion Parameters for Breakthrough Performance

In recent years, large neural networks trained for language understanding and generation have achieved impressive results across a wide range of tasks. GPT-3 first showed that large language models (LLMs) can be used for few-shot learning and can achieve impressive results without large-scale task-specific data collection or model parameter updating. More recent LLMs, such as GLaM, LaMDA, Gopher, and Megatron-Turing NLG, achieved state-of-the-art few-shot results on many tasks by scaling model size, using sparsely activated modules, and training on larger datasets from more diverse sources. Yet much work remains in understanding the capabilities that emerge with few-shot learning as we push the limits of model scale.
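The few-shot learning described above works by placing a handful of worked examples directly in the model’s input context, with no parameter updates. A minimal sketch of how such a prompt is assembled (the task, example reviews, and labels here are invented for illustration, not from any paper):

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labeled examples and a new query into one prompt string.

    The model is expected to continue the pattern and emit a label for the
    final, unlabeled query.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}")
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

# Two in-context examples, then an unlabeled query for the model to complete.
examples = [
    ("The plot was gripping from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
prompt = build_few_shot_prompt(examples, "A delightful surprise.")
```

The resulting string is sent to the model as-is; no task-specific data collection or fine-tuning is involved.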

Last year Google Research announced our vision for Pathways, a single model that could generalize across domains and tasks while being highly efficient. An important milestone toward realizing this vision was to develop the new Pathways system to orchestrate distributed computation for accelerators. In “PaLM: Scaling Language Modeling with Pathways”, we introduce the Pathways Language Model (PaLM), a 540-billion parameter, dense decoder-only Transformer model trained with the Pathways system, which enabled us to efficiently train a single model across multiple TPU v4 Pods. We evaluated PaLM on hundreds of language understanding and generation tasks, and found that it achieves state-of-the-art few-shot performance across most tasks, by significant margins in many cases.
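A “dense decoder-only Transformer” means every parameter is active for every token, and each position may attend only to itself and earlier positions. A toy numpy sketch of that causal self-attention step (sizes are tiny and weights random; real models like PaLM stack many learned layers):

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, d = 5, 8
# Random stand-ins for the learned query/key/value projections.
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))

def causal_self_attention(x):
    """x: (seq_len, d). Each position attends only to itself and the past."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(d)
    # Mask out the strict upper triangle: no attention to future tokens.
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the allowed positions.
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ v

x = rng.normal(size=(seq_len, d))
out = causal_self_attention(x)
```

Because of the mask, perturbing a later token cannot change the output at an earlier position, which is what makes autoregressive generation possible.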

Introducing the Museum of The Future — The chance to live in 2071!

Another jewel in Dubai’s crown.

Dubai’s penchant for housing some of the world’s most magnificent creations is no secret. Beautiful buildings and jaw-dropping structures with breathtaking designs have helped the city build a solid foundation for its identity as one of the top tourist destinations in the world. On the palindrome date of 22nd February 2022, Dubai added another feather to its cap of architectural marvels as it unveiled the Museum of The Future, a standing tribute to science and technology that offers visitors an immersive experience of living in the future. It will house some of the world’s most futuristic technologies, ideas, and innovative products.

The spectacular structure of the Museum of The Future is perhaps one of the most complex designs ever created and willed into solid reality in the history of architecture. So much so that His Highness Sheikh Mohammed bin Rashid Al Maktoum, the ruler of Dubai, has already hailed it as ‘the most beautiful building in the world’ in tribute to its marvelous design. The museum’s elliptical form has invited different symbolic interpretations: some say the ellipse represents humanity and the void the unknown future, while others have compared the structure to a human eye gazing at the future.

Microsoft Translator enhanced with Z-code Mixture of Experts models

Translator, a Microsoft Azure Cognitive Service, is adopting Z-code Mixture of Experts models, a breakthrough AI technology that significantly improves the quality of production translation models. As a component of Microsoft’s larger XYZ-code initiative to combine AI models for text, vision, audio, and language, Z-code supports the creation of AI systems that can speak, see, hear, and understand. This effort is a part of Azure AI and Project Turing, focusing on building multilingual, large-scale language models that support various production teams. Translator is using NVIDIA GPUs and Triton Inference Server to deploy and scale these models efficiently for high-performance inference. Translator is the first machine translation provider to introduce this technology live for customers.

Z-code MoE boosts efficiency and quality

Z-code models utilize a new architecture called Mixture of Experts (MoE), where different parts of the models can learn different tasks. The models learn to translate between multiple languages at the same time. The Z-code MoE model utilizes more parameters while dynamically selecting which parameters to use for a given input. This enables the model to specialize a subset of the parameters (experts) during training. At runtime, the model uses the relevant experts for the task, which is more computationally efficient than using all of the model’s parameters.
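The routing idea above can be sketched in a few lines of numpy: a learned gate scores all experts for a given input, but only the top-k experts are actually evaluated. The expert count, dimensions, and top-k value here are illustrative choices, not Z-code’s actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, num_experts, top_k = 8, 4, 2

gate_w = rng.normal(size=(d_model, num_experts))             # gating network
expert_w = rng.normal(size=(num_experts, d_model, d_model))  # one expert layer each

def moe_layer(x):
    """Route input x of shape (d_model,) through only the top-k experts."""
    scores = x @ gate_w                       # one score per expert
    chosen = np.argsort(scores)[-top_k:]      # indices of the top-k experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                  # softmax over the chosen experts
    # Only the selected experts run; their outputs are gate-weighted and summed.
    return sum(w * (x @ expert_w[i]) for w, i in zip(weights, chosen))

x = rng.normal(size=d_model)
y = moe_layer(x)
```

The computational saving comes from the fact that only `top_k` of the `num_experts` expert layers are evaluated per input, even though all of them contribute parameters to the model.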

The Rise of Artificial Intelligence | Wondrium Perspectives

For almost a century, we’ve been intrigued and sometimes terrified by the big questions of artificial intelligence. Will computers ever become truly intelligent? Will the time come when machines can operate without human intervention? What would happen if a machine developed a conscience?

In this episode of Perspectives, six experts in the fields of robotics, sci-fi, and philosophy discuss breakthroughs in AI development that are both promising and a bit worrisome.

Clips in this video are from the following series on Wondrium:

Mind-Body Philosophy, presented by Patrick Grim.
https://www.wondrium.com/mind-body-philosophy.

Introduction to Machine Learning, presented by Michael L. Littman.
https://www.wondrium.com/introduction-to-machine-learning.

Redefining Reality, presented by Steven Gimbel.

BIG designs virtual office in the metaverse for Vice

Danish architecture studio BIG has designed its first building in the metaverse, a virtual office for employees at media company Vice Media Group called Viceverse.

The recently opened Viceverse office is located on the Decentraland platform, where it will serve as the agency’s virtual innovation lab and allow employees to work in the metaverse on non-fungible tokens (NFTs) and other digital projects.

Blue Origin NS-20: Launch date, passenger list, flight details for Pete Davidson trip

In the long term, Bezos hopes to develop the infrastructure that could enable humanity’s biggest goals in spaceflight, much as Amazon leveraged existing infrastructure like the postal service to realize its own ambitions decades later.

Bezos envisions giant orbiting cities, located close to Earth, that could allow humanity to grow to a trillion people. The cities could host leisure and recreation, or heavy industry that would keep pollution away from nearby Earth.

It could all start with flights like NS-20. Here’s what you need to know.
