
New age smartphone cameras: Not just selfies, detect anaemia, check food quality and more

If you thought a smartphone camera was good enough to only click selfies, think again. From sensing food quality to screening for anaemia, there’s much more to the new-age smartphone camera than meets the eye.

Researchers across the globe today have tweaked and developed smartphone cameras into tools that can do all this and more. (Reuters)

THE LATEST report doing the rounds regarding the launch of the iPhone 8 sometime next year is that Apple is working with LG to develop a smartphone camera module that would enable 3D photography. This news is bound to send selfie-lovers into raptures. But while 3D photography on a smartphone is, no doubt, an exciting prospect, what if we were to tell you that there’s much more that some new-age smartphone cameras can achieve? How about using the camera on your phone to sense food quality, for instance? What if it could screen you for anaemia or help you brush your teeth better? Researchers across the globe have tweaked and developed smartphone cameras into tools that can do all this and more.

Detecting melanoma

Melanoma is one of the deadliest forms of skin cancer, but if detected in the early stages, the chances of survival are remarkably high. Now, researchers are working on devising a new way to detect the disease not with blood tests, but with a smartphone camera. In an IBM Research blog post, cognitive computing and computer vision scientist Noel Codella mentions how, in the future, doctors and patients will be able to identify possible signs of melanoma with the help of skin image analysis. “Our vision is that taking pictures to diagnose melanoma might one day be as routine as drawing blood to detect other diseases. Equipped with a smartphone or any other camera attached to a dermascope, this type of technology would enable doctors, nurses or support staff to take a photo of a lesion, send it to a cloud-based analytics service and receive a detailed report,” Codella writes in the blog post.

This report, Codella writes, could help a clinician determine whether the lesion has characteristics of melanoma or any other type of skin cancer. It would also contain other relevant information, which might further assist the medical staff in assessing a patient.

Once implemented in clinical practice, this technology could be a game-changer since there are currently no reliable blood tests for melanoma. In 2015, the IBM Research team published preliminary research on this topic in which they explained the ability of computer vision approaches to help find disease markers in dermoscopy images. They have also directly compared the performance of these new methods to that of eight specialists. A well-established dermatologist, writes Codella, sees 25 or more patients per day. Assuming there are approximately 20 workdays in a month, this results in over 6,000 patients per year, with potentially multiple lesions observed in each patient. By comparison, writes Codella, the IBM system has seen fewer than 3,000 lesions, meaning its experience level is along the lines of a medical student. “But the hope is that, one day, the simple act of taking a picture could help clinicians quickly and accurately test for melanoma and other dangerous skin diseases,” Codella writes. The technology, however, remains a work in progress, as it requires more research and testing.

The Genius 9000 toothbrush, developed by the Oral-B brand of Procter & Gamble, teaches people to brush more effectively with the help of their smartphone camera.

Brushing teeth better

The same camera that lets you click that picture-perfect selfie can now help you brush your teeth better. Wondering how? The Genius 9000 toothbrush, developed by the Oral-B brand of Procter & Gamble, teaches people to brush more effectively with the help of their smartphone camera. The toothbrush is fitted with a set of accelerometers that measure changes in motion in three dimensions, calculating even minute changes in brush angles. This is where the smartphone camera comes into the picture. All you need to do is download an app, created by P&G scientists and the Fraunhofer Society for the Advancement of Applied Research, on your smartphone. This app uses the smartphone’s camera and facial recognition technology to measure the position of the brush relative to the mouth—Oral-B has termed this ‘position detection’. Users just have to place their smartphone in front of them when they brush—they can attach it to the bathroom mirror using a suction cup holder that comes with the toothbrush, which is available online (Rs 32,000). The app then shows a user which parts of the mouth have been brushed properly and which areas need more work.

When a user starts brushing, the app displays a circle divided into six blue segments corresponding to the six areas of the mouth. As you brush each area, the blue fades, leaving a clean white segment. The app also remembers the user’s past 30 brushing sessions. The smartphone app that links to the brush can also be programmed by your dental professional to issue notes and reminders during and after a brushing session. But does it work? Well, initial data for the Oral-B Genius 9000 shows that some users now brush for 2 minutes and 24 seconds on average, which exceeds the recommended brushing time of two minutes, twice a day.
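A small sketch gives a sense of the bookkeeping such an app might do behind those fading segments. The six zones, the per-zone timer and the 20-second target below (six zones x 20 seconds matches the recommended two minutes) are assumptions made for illustration, not Oral-B’s actual implementation:

from collections import defaultdict

# Hypothetical mouth zones, mirroring the six segments shown in the app.
ZONES = ["upper-left", "upper-front", "upper-right",
         "lower-left", "lower-front", "lower-right"]
TARGET_SECONDS_PER_ZONE = 20  # assumed: 6 zones x 20 s = the recommended 2 minutes

coverage = defaultdict(float)  # seconds of brushing accumulated per zone

def record_sample(zone: str, seconds: float) -> None:
    """Add brushing time to the zone the camera's position detection reports."""
    coverage[zone] += seconds

def zone_progress(zone: str) -> float:
    """Fraction of the blue segment that has 'faded' to white (0.0 to 1.0)."""
    return min(coverage[zone] / TARGET_SECONDS_PER_ZONE, 1.0)

# Example: position detection reports half a second of brushing in the upper-left zone.
record_sample("upper-left", 0.5)
print(zone_progress("upper-left"))  # 0.025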

Japanese tech giant Hitachi has developed a technology in which the smartphone camera can be used for such authentication.

Finger vein reading

Finger vein reading is a popular method of biometric authentication. Infrared light is transmitted through the finger and partially absorbed by haemoglobin in the veins to capture a unique finger vein pattern, which is then matched against a pre-registered profile to verify an individual’s identity. Now, Japanese tech giant Hitachi has developed a technology in which the smartphone camera can be used for such authentication. As per the company, this is the first practical finger vein authentication technology that uses a phone’s camera. Software that works with colour information makes this technology more effective than the infrared variant. Additionally, multiple fingers can be used, which improves accuracy; the colour and shape of the fingers are registered in advance.

The technology can even extract fingers from a background image. There is also a provision for normalising finger posture, so that authentication works even when the angle of the hand changes. Hitachi tested the technology in-house: pictures were taken using an 8-MP camera and transferred to a tablet computer for authentication. Although the technology was demonstrated on Android, the company plans to support iOS as well when it is ready for commercial use. As per the company, this technology can replace traditional means of authentication in financial transactions, online shopping and so on.
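As a rough illustration of the matching step alone, the sketch below compares a freshly captured, binarised vein pattern against an enrolled one. The similarity measure and threshold are hypothetical; the finger extraction, colour processing and posture normalisation Hitachi describes are far more involved:

import numpy as np

def match_score(candidate: np.ndarray, enrolled: np.ndarray) -> float:
    """Overlap (Jaccard) score between two binarised vein patterns of equal shape.

    A value of 1 marks a vein pixel; a real system would align and score
    the patterns far more robustly than this.
    """
    overlap = np.logical_and(candidate, enrolled).sum()
    union = np.logical_or(candidate, enrolled).sum()
    return float(overlap / union) if union else 0.0

def authenticate(candidate: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    # The threshold is an arbitrary illustrative value, not Hitachi's.
    return match_score(candidate, enrolled) >= threshold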

What makes finger vein authentication appealing is the security aspect. Biometric methods such as fingerprint, voiceprint and facial recognition are prone to spoofing and can be forged relatively easily. Finger vein authentication is far more resistant to such attacks, since vein patterns lie beneath the skin and are extremely difficult to replicate. Also, at the time of commercialisation, Hitachi plans to use a smartphone or a cloud server to carry out the authentication instead of a tablet.


Detecting anaemia

Anaemia remains a major problem in developing countries. Take India, for instance. A recent National Family Health Survey found that over 55% of Indian women were anaemic. But how does a smartphone camera enter the frame? The answer comes from the University of Washington, where scientists have developed an application called HemaApp, which uses a smartphone camera (and the flash) to estimate haemoglobin concentration and screen for anaemia.

All one has to do is focus the light from the camera’s flash on to a patient’s finger. HemaApp then analyses the colour of the blood. It continuously floods a patient’s finger with different wavelengths of light and infrared energy, and creates a series of videos. By analysing how colours are absorbed and reflected across those wavelengths, HemaApp can detect the concentration of haemoglobin and other blood components such as plasma.
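In spirit, the analysis compares how strongly the fingertip absorbs light at each illumination wavelength. The sketch below is a simplified illustration with placeholder coefficients, not the researchers’ actual model, which also exploits the pulse waveform:

import numpy as np

def absorbance(transmitted: np.ndarray, incident: np.ndarray) -> np.ndarray:
    """Beer-Lambert-style absorbance at each wavelength.

    'transmitted' holds the mean fingertip brightness measured in the video taken
    under each light source; 'incident' holds the corresponding source intensities.
    """
    return -np.log10(transmitted / incident)

def estimate_haemoglobin(absorbances: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Toy linear model mapping per-wavelength absorbance to g/dL.

    The weights and bias are placeholders that would have to be fitted against
    reference blood tests; HemaApp's real model is more elaborate.
    """
    return float(absorbances @ weights + bias)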

Researchers say they tested the app under three different conditions: using the smartphone camera’s flash alone; in combination with a common incandescent light bulb; and with a low-cost LED lighting attachment. The app correctly identified cases of low haemoglobin levels 79% of the time when used with just the phone camera. This number went up to 86% when the app was used with some additional light source. To ensure it worked on different skin tones, the team developed processing algorithms that use a patient’s pulse to differentiate between the properties of the patient’s blood and the physical characteristics of his/her finger, the paper said.

As per the university website, HemaApp is not meant to replace traditional blood tests—which are more accurate—but researchers do believe that the app can be an effective and affordable initial screening tool to determine whether further blood testing is needed or not. As of now, the app is set for more testing.


Check health, food quality

Researchers from the VTT Technical Research Centre of Finland have developed the world’s first hyperspectral mobile device by converting the camera on an iPhone into an optical sensor that can sense food quality, as well as monitor health. As per the VTT website, the cost-effective optical MEMS (micro-opto-electro-mechanical systems) spectral technology enables new mobile applications for environmental sensing and for monitoring vehicles and drones; other uses include food analysis and health monitoring. “Consumer benefits could appear in health applications such as mobile phones that are able to check whether food is edible or moles are malignant,” says Anna Rissanen, research team leader, VTT.


Hyperspectral imaging provides access to the optical spectrum at each point of an image, enabling a wide range of measurements. The adjustable MEMS filter is integrated with the camera lens and its adjustment is synchronised with the camera’s image capture system. The hyperspectral sensor measures the data in each camera pixel of the target. This spectral data can be transformed into information about the measurement target by developing intelligent computational algorithms. Some features, like fruit freshness, can also be seen from the spectral image without algorithms, but most applications usually require processing of the data.
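As a toy illustration of what such processing could look like, the sketch below labels each pixel of a hyperspectral frame using a simple two-band index. The band choices, index and threshold are invented; a real application such as fruit-freshness grading would rely on a properly calibrated or trained model:

import numpy as np

def freshness_map(cube: np.ndarray, band_a: int, band_b: int, threshold: float = 0.2) -> np.ndarray:
    """Label each pixel of a hyperspectral cube (height x width x bands).

    Uses a normalised difference between two bands, in the spirit of vegetation
    indices; pixels above the threshold are flagged as 'fresh'.
    """
    a = cube[..., band_a].astype(float)
    b = cube[..., band_b].astype(float)
    index = (a - b) / (a + b + 1e-9)
    return index > threshold

# Example with random data standing in for a captured frame.
cube = np.random.rand(4, 4, 32)
print(freshness_map(cube, band_a=25, band_b=10))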

The sensor can also verify product authenticity or identify users based on biometric data. The VTT website adds that the centre aims to work with companies to commercialise the tech and bring new innovative optical sensor products to the market in the future. They have already developed a wide range of new applications, including environmental sensing based on nano-satellites, drone applications for precision agriculture, forest monitoring, etc.


First published on: 08-01-2017 at 06:04 IST