Did you know your Pixel had this? Did you know any phone had this? Google now wants you to know about it.
Computer scientists at ETH Zurich have developed a new AI solution that enables touchscreens to sense with eight times the resolution of current devices, inferring much more precisely where fingers touch the screen.
Quickly typing a message on a smartphone sometimes results in hitting the wrong letters on the small keyboard or on other input buttons in an app. The touch sensors that detect finger input on the touch screen have not changed much since they were first released in mobile phones in the mid-2000s.
In contrast, the screens of smartphones and tablets are now providing unprecedented visual quality, which is even more evident with each new generation of devices: higher color fidelity, higher resolution, crisper contrast. A latest-generation iPhone, for example, has a display resolution of 2532×1170 pixels. But the touch sensor it integrates can only detect input with a resolution of around 32×15 pixels—that’s almost 80 times lower than the display resolution: “And here we are, wondering why we make so many typing errors on the small keyboard? We think that we should be able to select objects with pixel accuracy through touch, but that’s certainly not the case,” says Christian Holz, ETH computer science professor from the Sensing, Interaction & Perception Lab (SIPLAB) in an interview in the ETH Computer Science Department’s “Spotlights” series.
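The "almost 80 times lower" figure quoted above can be checked with a quick back-of-the-envelope calculation, comparing each axis of the display resolution against the corresponding axis of the touch sensor:

```python
# Rough check of the resolution gap described in the article.
display = (2532, 1170)  # latest-generation iPhone display, in pixels
touch = (32, 15)        # approximate touch-sensor resolution, per the article

for axis, (d, t) in zip(("horizontal", "vertical"), zip(display, touch)):
    print(f"{axis}: display is {d / t:.0f}x finer than the touch sensor")
```

Both axes come out at roughly 78–79x, matching the article's "almost 80 times" claim.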
Tokyo (AFP)
Paralysed from the neck down, the man stares intently at a screen. As he imagines handwriting letters, they appear before him as typed text thanks to a new brain implant.
The 65-year-old is “typing” at a speed similar to his peers tapping on a smartphone, using a device that could one day help paralysed people communicate quickly and easily.
Researchers in Singapore have found a way of controlling a Venus flytrap using electric signals from a smartphone, an innovation they hope will have a range of uses from robotics to employing the plants as environmental sensors.
Luo Yifei, a researcher at Singapore’s Nanyang Technological University (NTU), showed in a demonstration how a signal from a smartphone app sent to tiny electrodes attached to the plant could make its trap close as it does when catching a fly.
“Plants are like humans, they generate electric signals, like the ECG (electrocardiogram) from our hearts,” said Luo, who works at NTU’s School of Materials Science and Engineering.
The findings could lead to faster, more secure memory storage, in the form of antiferromagnetic bits.
When you save an image to your smartphone, those data are written onto tiny transistors that are electrically switched on or off in a pattern of “bits” to represent and encode that image. Most transistors today are made from silicon, an element that scientists have managed to switch at ever-smaller scales, enabling billions of bits, and therefore large libraries of images and other files, to be packed onto a single memory chip.
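The on/off transistor pattern described above is just a binary encoding. As a minimal illustration (not tied to any particular chip), a single 8-bit grayscale pixel value maps to eight transistor states:

```python
# One grayscale pixel value (0-255) encoded as 8 on/off "bits",
# the way a memory chip's transistors would store it.
pixel = 200
bits = format(pixel, "08b")  # e.g. '11001000': 1 = transistor on, 0 = off
print(bits)

# Decoding the bit pattern recovers the original value.
print(int(bits, 2))
```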
But growing demand for data, and the means to store them, is driving scientists to search beyond silicon for materials that can push memory devices to higher densities, speeds, and security.
The pace of those improvements has slowed, but International Business Machines Corp on Thursday said that silicon has at least one more generational advance in store.
IBM introduced what it says is the world’s first 2-nanometer chipmaking technology. The technology could be as much as 45% faster than the mainstream 7-nanometer chips in many of today’s laptops and phones and up to 75% more power efficient, the company said.
The fight against gadget waste is being spurred on by new printable electronics made from wood ink.
In work that could someday turn cell phones into sensors capable of detecting viruses and other minuscule objects, MIT researchers have built a powerful nanoscale flashlight on a chip.
Their approach to designing the tiny light beam on a chip could also be used to create a variety of other nano flashlights with different beam characteristics for different applications. Think of a wide spotlight versus a beam of light focused on a single point.
For many decades, scientists have used light to identify a material by observing how that light interacts with the material. They do so by essentially shining a beam of light on the material, then analyzing that light after it passes through the material. Because all materials interact with light differently, an analysis of the light that passes through the material provides a kind of “fingerprint” for that material. Imagine doing this for several colors — i.e., several wavelengths of light — and capturing the interaction of light with the material for each color. That would lead to a fingerprint that is even more detailed.
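The fingerprinting idea above can be sketched in a few lines. This is a toy illustration, not the MIT system: the material names, wavelength count, and transmission values are all made up, and matching is done by a simple least-squares comparison against known reference spectra:

```python
# Toy spectral-fingerprint matcher: identify a sample by comparing its
# measured transmission spectrum (one value per wavelength) against
# known reference "fingerprints". All numbers are hypothetical.
reference = {
    "water":   [0.9, 0.7, 0.2, 0.1],  # transmission at 4 wavelengths
    "ethanol": [0.8, 0.3, 0.6, 0.4],
}

def identify(measured):
    # Pick the reference whose spectrum is closest (sum of squared errors).
    return min(
        reference,
        key=lambda name: sum((m - r) ** 2 for m, r in zip(measured, reference[name])),
    )

print(identify([0.88, 0.68, 0.25, 0.12]))  # closest to the "water" fingerprint
```

Using more wavelengths, as the paragraph suggests, makes each fingerprint more detailed and the match more reliable.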
The Enten headphones, which use a smartphone app, have been developed by US firm Neurable. They can create music playlists based on which songs seem to help the user concentrate.