For as long as computers have existed, physicists have used them as tools to understand, predict and model the natural world. Computing experts, for their part, have used advances in physics to develop machines that are faster, smarter and more ubiquitous than ever. This collection celebrates the latest phase in this symbiotic relationship, as the rise of artificial intelligence and quantum computing opens up new possibilities in basic and applied research.
Manufacturing silicon qubits at scale
As quantum computing matures, will decades of engineering give silicon qubits an edge? Fernando Gonzalez-Zalba, Tsung-Yeh Yang and Alessandro Rossi think so
Small computers find an industrial niche
Physicist and Raspberry Pi inventor Eben Upton explains how simple computers are becoming integral to the Internet of Things
30 years of the Web
Challenges of interdisciplinary physics and the Web at 30
Physics World journalists discuss the week’s highlights
Vague but exciting: how the Web transformed business
James McKenzie explains how Tim Berners-Lee's invention of the World Wide Web at CERN has revolutionized how we trade.
Electronic publishing and visions of hypertext
Tim Berners-Lee predicts the future of online publishing in an article he wrote for Physics World in 1992
Illustrating 30 years of the Web
Jess Wade illustrates the history of the World Wide Web, from the technology that enabled it to the everyday staple it is today
The future of the Internet
Emerging technologies shaping our connected world
Physics World 30th anniversary podcast series – 30 years of the World Wide Web
Fifth episode in mini-series revisits the birth of the Web and the challenges it now faces
The third pillar of science
Computing is transforming scientific research, but are researchers and software code adapting at the same rate? Benjamin Skuse finds out
Simulations reveal new insights
How scientific models both help and deceive us in decision making
Michela Massimi reviews Escape from Model Land by Erica Thompson
Particle physicists get AI help with beam dynamics
New machine learning algorithm accurately reconstructs the shapes of particle accelerator beams from tiny amounts of data
Climate-change ‘fingerprint’ is identified in the upper atmosphere
Observations agree with computer simulations of global warming
Threshold for X-ray flashes from lightning is identified by simulations
Research could lead to new types of X-ray sources
‘More than Moore’: a glimpse at the future of computing
Available to watch now, IOP Publishing explores what lies beyond the era of Moore’s law and looks at some of the technologies that could play roles in the computers of the future
‘Forest of cylindrical obstacles’ slows avalanche flow
New model of downslope granular movement could reduce the destructive power of avalanches and other dangerous geophysical phenomena
Machine learning reveals new science
Deep learning helps radiologists detect lung cancer on chest X-rays
Introducing artificial intelligence into the clinical workflow helps radiologists detect lung cancer lesions on chest X-rays and dismiss false-positives
Machine learning puts nanomaterials in the picture
Algorithms help materials scientists recognize patterns in structure-function relationships
Deep learning algorithm helps diagnose neurological emergencies
A deep learning algorithm detects brain haemorrhages on head CT scans with comparable performance to highly trained radiologists
Artificial intelligence helps detect atrial fibrillation
An artificial intelligence model can identify patients with intermittent atrial fibrillation from scans performed during normal heart rhythm
Machine learning is implemented on an IBM quantum processor
Proof-of-concept demonstration performed using two superconducting qubits
AI framework uses medical images to individualize radiotherapy dose
An image-based artificial intelligence framework predicts a personalized radiation dose that minimizes the risk of treatment failure
AI predicts coma outcome from EEG trace
A machine learning algorithm can read electroencephalograms as well as clinicians
The latest in quantum computing
IBM’s 127-qubit processor shows quantum advantage without error correction
Quantum error mitigation used to calculate 2D Ising model
Real-time error correction extends the lifetime of quantum information
Researchers perform the first experimental demonstration that error correction significantly extends the lifetime of information in a quantum system
Breakthrough in quantum error correction could lead to large-scale quantum computers
Errors can be reduced by increasing the number of qubits
Quantum processors still struggle to simulate complex molecules
Latest results show that classical computers retain an edge when simulating quantum chemistry problems – at least for now
Quantum teleportation opens a ‘wormhole in space–time’
String theory and quantum gravity could be put to the test by quantum processor
Six-qubit silicon quantum processor sets a record
Advances in calibration routines and device fabrication lead to high-fidelity operations
Related events
- Materials | Meeting: Faraday Discussion – Electrosynthesis | 12–14 July 2023 | Edinburgh, UK
- Mathematics and computation | Conference: International Conference on Data Science, AI and Analytics: Bridging the Gap Between Theory and Practices (ICDSAIA-2023) | 13–14 September 2023 | Subang Jaya, Malaysia
- Mathematics and computation | Workshop: Non-autonomous Dynamics in Complex Systems: Theory and Applications to Critical Transitions | 9–27 October 2023 | Dresden, Germany