The brain contains about 100 billion microscopic cells called neurons, which together can generate enough electricity to power a low-wattage bulb. Scientists, researchers and forward-looking tech companies are investigating ways to use that power to control devices remotely.
In healthcare, medical devices and systems can help save lives and improve quality of life for people living with different conditions and diseases. Machine learning, natural language processing and image recognition facilitate the monitoring, analysis, diagnosis and treatment of patients.
Engineers and scientists around the world are racing to build quantum computing devices capable of achieving quantum supremacy, which is broadly defined as solving problems that today’s computers cannot. Quantum devices will eventually have processing power that overshadows anything that contemporary supercomputers can achieve. They are expected to bring massive benefits, such as accelerating medical research, making advances in artificial intelligence and perhaps even finding answers to climate change.
Over the last century, automation has advanced in many industries. More recently, people increasingly work alongside non-human entities, which rely more and more on artificial intelligence (AI) technologies.
In 2019, a single day of data includes 500 million tweets, 294 billion emails, four terabytes produced by a connected car, 65 billion messages sent over WhatsApp, two billion minutes of voice and video calls, five billion searches and 95 million photos and videos shared on Instagram, according to research by the data journalism specialists at Raconteur. By 2020, wearable devices are expected to produce 28 petabytes (1000⁵ bytes) of data.
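As a quick check of the units in the figures above, a minimal sketch of the petabyte conversion (the 28-petabyte estimate is taken from the text; the 1000⁵ definition is the standard SI decimal petabyte):

```python
# A decimal (SI) petabyte is 1000^5 bytes, as noted in the text.
PETABYTE = 1000 ** 5  # 10^15 bytes

# Projected annual output of wearable devices cited above.
wearable_data_bytes = 28 * PETABYTE

print(f"{wearable_data_bytes:,} bytes")  # 28,000,000,000,000,000 bytes
```

The same constant can be reused to convert the other daily figures, e.g. the four terabytes from a connected car is `4 * 1000 ** 4` bytes.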
In parts of Asia, North and South America, Europe and Africa, digital technologies are enabling students to learn more effectively and from entirely new perspectives.
The Internet of Things (IoT), increased connectivity and advances in artificial intelligence (AI) technologies, such as algorithms and machine learning, are enabling industries to streamline processes, improve efficiency and reduce costs as they become more digitized.
Today, for many, technology is an inextricable part of life and healthcare. Friendly robots administer daily medications, algorithms diagnose diseases more accurately than top specialists, and a doctor's appointment can happen over Skype.
Wael Diab, who is leading international efforts to standardize artificial intelligence (AI), has identified the mitigation of data bias as a priority challenge for eventual future standards work. Diab recently told the IEC General Meeting, in Busan, South Korea, that a broad standardization approach is necessary.
Imagine being able to predict medical conditions in healthy people and take steps to prevent them before symptoms develop, or having fully autonomous systems monitor critical patients in intensive care units instead of requiring a team of specialists.
In 2007, when cautious doctors replaced a former US Vice President's heart defibrillator, a battery-powered device placed under the skin to monitor heart rate, they had the manufacturer disable its wireless feature so that it could not be hacked by terrorists.
Information technology has become an integral part of our lives whether it be in the consumer, industrial or commercial aspects. It is hard to imagine life, work or entertainment without it. Artificial intelligence (AI) presents the next digital frontier of the IT evolution.