Blog

Archive for the ‘information science’ category: Page 171

Aug 15, 2021

UAT Virtual Let’s Talk Tech Open House

Posted by in categories: bioengineering, biological, genetics, information science, internet, robotics/AI

University of Advancing Technology’s Artificial Intelligence (AI) degree explores the theory and practice of engineering tools that simulate thinking, patterning, and advanced decision behaviors by software systems. Drawing design inspiration from biology, UAT’s Artificial Intelligence program teaches students to build software systems that solve complex problems. Students will work with technologies including voice recognition, simulation agents, machine learning (ML), and the Internet of Things (IoT).

Students pursuing this specialized computer programming degree develop applications using evolutionary and genetic algorithms, cellular automata, artificial neural networks, agent-based models, and other artificial intelligence methodologies. UAT’s degree in AI covers the fundamentals of general and applied artificial intelligence, including the core programming languages and platforms used in computer science.

Continue reading “UAT Virtual Let’s Talk Tech Open House” »

Aug 14, 2021

New Algorithm Trains Drones To Fly Around Obstacles at High Speeds

Posted by in categories: drones, information science, robotics/AI

New algorithm could enable fast, nimble drones for time-critical operations such as search and rescue.

If you follow autonomous drone racing, you likely remember the crashes as much as the wins. In drone racing, teams compete to see which vehicle is better trained to fly fastest through an obstacle course. But the faster drones fly, the more unstable they become, and at high speeds their aerodynamics can be too complicated to predict. Crashes, therefore, are a common and often spectacular occurrence.

Continue reading “New Algorithm Trains Drones To Fly Around Obstacles at High Speeds” »

Aug 13, 2021

Progress in algorithms makes small, noisy quantum computers viable

Posted by in categories: information science, quantum physics, robotics/AI

As reported in a new article in Nature Reviews Physics, instead of waiting for fully mature quantum computers to emerge, Los Alamos National Laboratory and other leading institutions have developed hybrid classical/quantum algorithms to extract the most performance—and potentially quantum advantage—from today’s noisy, error-prone hardware. Known as variational quantum algorithms, they use the quantum boxes to manipulate quantum systems while shifting much of the workload to classical computers, letting them do what they currently do best: solve optimization problems.

“Quantum computers have the promise to outperform classical computers for certain tasks, but on currently available quantum hardware they can’t run long algorithms. They have too much noise as they interact with the environment, which corrupts the information being processed,” said Marco Cerezo, a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos and a lead author of the paper. “With variational quantum algorithms, we get the best of both worlds. We can harness the power of quantum computers for tasks that classical computers can’t do easily, then use classical computers to complement the computational power of quantum devices.”

Current noisy, intermediate-scale quantum computers have between 50 and 100 qubits, lose their “quantumness” quickly, and lack error correction, which requires more qubits. Since the late 1990s, however, theoreticians have been developing algorithms designed to run on an idealized large, error-corrected, fault-tolerant quantum computer.
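The division of labor described above—a quantum device evaluates a parameterized cost, a classical optimizer tunes the parameters—can be sketched in a few lines. This is a minimal illustrative loop, not the Los Alamos implementation: the "quantum device" is an exact classical simulation of a single qubit, the toy Hamiltonian and learning rate are made up, and on real hardware the energy would be estimated from repeated noisy measurements.

```python
import numpy as np

# Toy Hamiltonian whose ground-state energy we want: H = Z + 0.5 * X
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
H = Z + 0.5 * X

def ansatz_state(theta):
    """Parameterized 'circuit': a Y-rotation applied to |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(theta):
    """Cost the quantum device would estimate: <psi(theta)|H|psi(theta)>."""
    psi = ansatz_state(theta)
    return float(np.real(np.conj(psi) @ H @ psi))

# The classical optimizer handles what classical computers do best:
# iteratively tuning the circuit parameter to minimize the cost.
theta, lr = 0.1, 0.2
for _ in range(300):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= lr * grad

exact = float(np.linalg.eigvalsh(H).min())
print(f"variational energy: {energy(theta):.4f}, exact: {exact:.4f}")
```

For this two-level toy problem the variational loop recovers the exact ground-state energy; the point of the hybrid approach is that the same outer loop works when the inner expectation value comes from quantum hardware too large to simulate classically.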

Aug 13, 2021

Classical variational simulation of the Quantum Approximate Optimization Algorithm

Posted by in categories: computing, information science, quantum physics

In this work, we introduce a classical variational method for simulating QAOA, a hybrid quantum-classical approach for solving combinatorial optimization problems with prospects of quantum speedup on near-term devices. We employ a self-contained approximate simulator based on neural-network quantum state (NQS) methods borrowed from many-body quantum physics, departing from the traditional exact simulations of this class of quantum circuits.

We successfully explore previously unreachable regions of the QAOA parameter space, owing to the good performance of our method near the optimal QAOA angles. The model’s limitations are discussed in terms of the lower fidelity of the reproduced quantum state away from that optimum. Because of this different area of applicability and its relatively low computational cost, the method is introduced as complementary to established numerical methods for the classical simulation of quantum circuits.

Classical variational simulations of quantum algorithms provide a natural way to both benchmark and understand the limitations of near-future quantum hardware. On the algorithmic side, our approach can help answer a fundamentally open question in the field, namely whether QAOA can outperform classical optimization algorithms or quantum-inspired classical algorithms based on artificial neural networks [48–50].
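To make the object of study concrete: the quantity a QAOA circuit optimizes can be computed exactly for tiny problems by simulating the statevector directly. The sketch below does this for a depth p=1 circuit on a hypothetical 4-node ring MaxCut instance. This is the traditional exact simulation the excerpt above contrasts with; the paper's contribution is to replace it with an approximate NQS ansatz so that larger circuits become reachable.

```python
import numpy as np
from itertools import product

# Tiny MaxCut instance: a 4-node ring (illustrative choice).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
dim = 2 ** n

# Diagonal MaxCut cost: C(z) = number of edges cut by bitstring z.
cost = np.zeros(dim)
for idx in range(dim):
    bits = [(idx >> q) & 1 for q in range(n)]
    cost[idx] = sum(bits[i] != bits[j] for i, j in edges)

X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

def mixer(beta):
    """e^{-i beta X} applied to every qubit (Kronecker product)."""
    u1 = np.cos(beta) * I - 1j * np.sin(beta) * X
    U = np.array([[1.0]], dtype=complex)
    for _ in range(n):
        U = np.kron(U, u1)
    return U

def qaoa_expectation(gamma, beta):
    """Exact <C> for the depth p=1 QAOA state."""
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+>^n
    psi = np.exp(-1j * gamma * cost) * psi               # phase separator
    psi = mixer(beta) @ psi                              # mixing layer
    return float(np.real(np.conj(psi) @ (cost * psi)))

# Coarse grid search over the (gamma, beta) angle landscape.
best = max(
    (qaoa_expectation(g, b), g, b)
    for g, b in product(np.linspace(0, np.pi, 40), repeat=2)
)
print(f"best <C> = {best[0]:.3f} (max cut = {cost.max():.0f})")
```

The cost of this exact approach grows as 2^n, which is exactly why approximate variational simulators such as the NQS-based one described above are interesting: they trade some fidelity away from the optimal angles for a far cheaper representation of the state.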

Aug 12, 2021

The Surprising Genius of 3D Printed Rockets

Posted by in categories: engineering, information science, space travel

3D printed rockets save on up-front tooling, enable rapid iteration, decrease part count, and facilitate radically new designs. For your chance to win 2 seats on one of the first Virgin Galactic flights to Space and support a great cause, go to https://www.omaze.com/veritasium.

Thanks to Tim Ellis and everyone at Relativity Space for the tour!
https://www.relativityspace.com/
https://youtube.com/c/RelativitySpace.

Continue reading “The Surprising Genius of 3D Printed Rockets” »

Aug 11, 2021

Faced with a Data Deluge, Astronomers Turn to Automation

Posted by in categories: information science, robotics/AI, space

Circa 2019


For better or worse, machine learning and big data are poised to transform the study of the heavens.

Aug 10, 2021

System trains drones to fly around obstacles at high speeds

Posted by in categories: drones, information science, robotics/AI

For drone racing enthusiasts. 😃


If you follow autonomous drone racing, you likely remember the crashes as much as the wins. In drone racing, teams compete to see which vehicle is better trained to fly fastest through an obstacle course. But the faster drones fly, the more unstable they become, and at high speeds their aerodynamics can be too complicated to predict. Crashes, therefore, are a common and often spectacular occurrence.

Continue reading “System trains drones to fly around obstacles at high speeds” »

Aug 9, 2021

Machine learning plus insights from genetic research shows the workings of cells – and may help develop new drugs for COVID-19 and other diseases

Posted by in categories: biological, biotech/medical, genetics, information science, robotics/AI

We combined a machine learning algorithm with knowledge gleaned from hundreds of biological experiments to develop a technique that allows biomedical researchers to figure out the functions of the proteins, called transcription factors, that turn genes on and off in cells. This knowledge could make it easier to develop drugs for a wide range of diseases.

Early on during the COVID-19 pandemic, scientists who worked out the genetic code of the RNA molecules of cells in the lungs and intestines found that only a small group of cells in these organs were most vulnerable to being infected by the SARS-CoV-2 virus. That allowed researchers to focus on blocking the virus’s ability to enter these cells. Our technique could make it easier for researchers to find this kind of information.

The biological knowledge we work with comes from this kind of RNA sequencing, which gives researchers a snapshot of the hundreds of thousands of RNA molecules in a cell as they are being translated into proteins. A widely praised machine learning tool, the Seurat analysis platform, has helped researchers all across the world discover new cell populations in healthy and diseased organs. This machine learning tool processes data from single-cell RNA sequencing without any information ahead of time about how these genes function and relate to each other.
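The core idea in that last paragraph—grouping cells by their expression profiles alone, with no prior labels—can be illustrated with a toy example. This is not Seurat's actual pipeline (which involves normalization, dimensionality reduction, and graph-based clustering); it is plain k-means on synthetic data, with all sizes and parameters invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes = 50

# Synthetic "cells": two populations with different mean expression
# levels across 50 genes, mimicking two distinct cell types.
pop_a = rng.normal(loc=0.0, scale=1.0, size=(100, n_genes))
pop_b = rng.normal(loc=3.0, scale=1.0, size=(100, n_genes))
cells = np.vstack([pop_a, pop_b])

def kmeans(data, k, iters=20):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids as cluster means."""
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None, :] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array(
            [data[labels == j].mean(axis=0) for j in range(k)]
        )
    return labels

labels = kmeans(cells, k=2)
# With no labels provided, the two synthetic populations are recovered.
print("cluster sizes:", np.bincount(labels))
```

Real single-cell data is far noisier and higher-dimensional than this, which is why tools like Seurat layer careful preprocessing on top of the unsupervised step—but the principle of discovering cell populations without prior annotation is the same.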

Aug 9, 2021

Twitter AI bias contest shows beauty filters hoodwink the algorithm

Posted by in categories: information science, robotics/AI

The service’s algorithm for cropping photos favors people with slimmer, younger faces and lighter skin.

Aug 7, 2021

AI Wrote Better Phishing Emails Than Humans in a Recent Test

Posted by in categories: cybercrime/malcode, government, information science, robotics/AI

Natural language processing continues to find its way into unexpected corners. This time, it’s phishing emails. In a small study, researchers found that they could use the deep learning language model GPT-3, along with other AI-as-a-service platforms, to significantly lower the barrier to entry for crafting spearphishing campaigns at a massive scale.

Researchers have long debated whether it would be worth the effort for scammers to train machine learning algorithms that could then generate compelling phishing messages. Mass phishing messages are simple and formulaic, after all, and are already highly effective. Highly targeted and tailored “spearphishing” messages are more labor-intensive to compose, though. That’s where NLP may come in surprisingly handy.

At the Black Hat and Defcon security conferences in Las Vegas this week, a team from Singapore’s Government Technology Agency presented a recent experiment in which they sent targeted phishing emails, some crafted by hand and others generated by an AI-as-a-service platform, to 200 of their colleagues. Both sets of messages contained links that were not actually malicious but simply reported clickthrough rates back to the researchers. They were surprised to find that more people clicked the links in the AI-generated messages than in the human-written ones—by a significant margin.