The Guardian view on Australia’s social media ban: dragging tech companies into action | Editorial

Children under the age of 16 needed protecting and the moral argument wasn’t winning. Government regulation can change the terms of debate

On 10 December, the world watched as Australia enacted the first social media ban for under-16s. Whether it will have the desired effect of improving young people’s lives we are yet to find out. But what the ban has achieved already is clear.

Many politicians, along with academics and philosophers, have noted that self-regulation has not been an effective safeguard against the harms of social media – especially when the bottom line for people like Mark Zuckerberg and Elon Musk depends on keeping eyes on screens. For too long, these companies resisted, decrying censorship and prioritising “free speech” over moderation. The Australian government decided waiting was no longer an option. The social media ban and similar regulation across the world is now dragging tech companies kicking and screaming toward change. That it has taken the force of the law to ensure basic standards – such as robust age verification, teen-friendly user accounts and deactivation where appropriate – are met shows the moral argument alone was not enough.

Continue reading...

© Photograph: Halfpoint Images/Getty Images

  •  

UnitedHealth reduced hospitalizations for nursing home seniors. Now it faces wrongful death claims

The company says it is protecting nursing home residents by curbing unnecessary hospital transfers. Whistleblowers allege cost-cutting tactics have endangered the elderly

Three nursing home residents died because employees of the American healthcare giant UnitedHealth Group helped delay or deny them critical hospital care, two pending lawsuits and a complaint to state authorities have alleged.

The three cases involve a UnitedHealth partnership initiative that places medical staff from the company’s direct care unit, Optum, inside nursing homes to care for residents insured by the company’s insurance arm.

Continue reading...

© Illustration: Angelica Alzona/Guardian Design

  •  

Why young people are the big losers in Europe’s dysfunctional housing system

The EU has unveiled its first-ever housing strategy, but is it enough to see off the far right and rescue a generation shut out of affordable living?

Donald Trump may rage about Europe being a multicultural hell facing “civilisational” collapse. As a proud real estate guy, however, he must be impressed by one feature of European life: the house prices, and the extent to which even progressive governments have abandoned housing to the markets.

Since 2010, average sale prices in the EU have surged by close to 60%. In some countries, such as the Netherlands, house prices have doubled in a decade. Rents, meanwhile, have increased by almost 30% on average in the last 15 years. The rent average masks dramatic spikes experienced in some countries: 208% in Estonia, 177% in Lithuania, 108% in Ireland and 107% in Hungary. If property has been a lucrative bet for wealthy investors, the cost of a home is a financial ordeal for millions of people whose incomes have been outpaced.

Continue reading...

© Photograph: Hollandse Hoogte/REX/Shutterstock

  •  

How the Rhinelander Trial Scandalized the Jazz Age

Rhinelander v. Rhinelander was one of the most scandalous trials of the Jazz Age. A century later, it reads as a tragedy about the country’s original sin.

© Getty Images

Alice Rhinelander, center, with her sisters in the courtroom during the annulment trial.
  •  

Gen Z behind jump in use of oral nicotine pouches across Great Britain

More than half a million people now use the products, as experts link the rise to ‘aggressive’ marketing and advertising

More than 500,000 people in Great Britain now use nicotine pouches, with the significant rise in uptake driven by members of gen Z, research has revealed.

Nicotine pouches are placed between the lip and gum to slowly release nicotine and come in a wide variety of flavours. Health experts say the products, which are banned in Germany and the Netherlands, should not be used by anyone who does not already smoke.

Continue reading...

© Photograph: Peter Dazeley

  •  

Rise of the full nesters: what life is like with adult children who just can’t leave home

In the UK, close to half of 25-year-olds now live with parents who, in many cases, would expect their nest to have long since emptied. How does this change families, for good and bad?

If life had worked out differently, Serena would by now be coming to terms with an empty nest. Having brought up seven children, she and her husband might even have been enjoying a little more money and time for themselves. But as it is, three of their adult children are now at home: the 23-year-old finishing his degree; the 28-year-old, a teacher, saving for a house deposit; and the 34-year-old, after a mental health crisis. At 63, Serena comes home from her job as a social worker to a mountain of laundry, and a spare downstairs room requisitioned as a bedroom.

Having a houseful is “really good fun”, she says, and makes life richer and more interesting. But it took a while to get used to partners staying over – “I’m not a prude, but you don’t necessarily want to be part of that life for your children, do you?” – and lately, she has felt the lack of an important rite of passage. “I’ve become old and I never really felt it, because I’ve been in that parent mode for such a long time,” she says. “It’s suddenly hit me that I didn’t have that transition that often happens, with kids who leave when you’re in your 40s and 50s – that just hasn’t happened. It’s odd.”

Continue reading...

© Illustration: Pat Thomas/The Guardian

  •  

The Army Made a Blind Black Soldier a Surrogate for Robert E. Lee

For more than a century, this Black soldier from Virginia was remembered by nearly no one. Then this year, someone at the Pentagon found a use for him.

© National Museum of African American History and Culture

Pvt. Fitz Lee in 1899, with his Medal of Honor on his jacket.
  •  

Jo Ann Allen Boyce Dies at 84; Braved Mobs in Integrating a School

She was one of the Clinton 12, Black students who broke a race barrier by entering a Tennessee high school in 1956 in the face of harassment by white segregationists.

© Don Cravens/Getty Images

Jo Ann Allen, left, and Minnie Ann Dickey at Clinton High School in Tennessee in September 1956, two years after the U.S. Supreme Court outlawed segregation in public schools.
  •  

Semiconductor laser pioneer Susumu Noda wins 2026 Rank Prize for Optoelectronics

Susumu Noda of Kyoto University has won the 2026 Rank Prize for Optoelectronics for the development of the Photonic Crystal Surface Emitting Laser (PCSEL). Noda has spent more than 25 years developing this new form of laser, which has potential applications in high-precision manufacturing as well as in LIDAR technologies.

Since the first laser was demonstrated in 1960, optical fibre lasers and semiconductor lasers have emerged in recent decades as competing technologies.

A semiconductor laser works by pumping an electrical current into a region where an n-doped (excess of electrons) and a p-doped (excess of “holes”) semiconductor material meet, causing electrons and holes to combine and release photons.
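
As a rough illustration of why the choice of semiconductor fixes the emission colour (the figures below are textbook values, not taken from the article), the energy of each emitted photon is approximately the band-gap energy of the material, so the wavelength follows directly from the band gap:

\[
% illustrative estimate: photon energy roughly equals the band-gap energy E_g
\lambda \;\approx\; \frac{hc}{E_g} \;\approx\; \frac{1240\ \text{eV nm}}{E_g}
\]

For gallium arsenide, with \(E_g \approx 1.42\ \text{eV}\), this gives \(\lambda \approx 870\ \text{nm}\) in the near-infrared; choosing a different semiconductor shifts the emission wavelength accordingly.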

Semiconductor lasers have several advantages, including compactness, high “wallplug” efficiency and ruggedness, but they fall short in other areas, notably brightness and functionality.

This means that conventional semiconductor lasers require external optical and mechanical elements to improve their performance, which results in large and impractical systems.

‘A great honour’

In the late 1990s, Noda began working on a new type of semiconductor laser that could challenge the performance of optical fibre lasers. These so-called PCSELs employ a photonic crystal layer between the semiconductor layers. Photonic crystals are nanostructured materials in which a periodic variation of the dielectric constant – formed, for example, by a lattice of holes – creates a photonic band gap.
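
A back-of-envelope sketch of the design condition may help here; the numbers below are illustrative assumptions, not values from the article. PCSELs are typically designed so that the photonic-crystal period matches the second-order Bragg (band-edge) condition, roughly one wavelength inside the material:

\[
% illustrative: square-lattice PCSEL operating at the second-order Bragg (band-edge) condition
a \;\approx\; \frac{\lambda}{n_{\mathrm{eff}}} \;\approx\; \frac{940\ \text{nm}}{3.4} \;\approx\; 280\ \text{nm}
\]

At this period the in-plane waves diffract back on themselves to form a large-area two-dimensional resonance, while first-order diffraction couples light out perpendicular to the surface – which is what makes the device surface-emitting.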

Noda and his research group made a series of breakthroughs in the technology, such as demonstrating control of polarization and beam shape by tailoring the photonic crystal structure, and extending operation into blue–violet wavelengths.

The resulting PCSELs emit a high-quality, symmetric beam with narrow divergence and boast high brightness and high functionality while maintaining the benefits of conventional semiconductor lasers. In 2013, 0.2 W PCSELs became available, and a few years later watt-class devices became operational.

Noda says that it is “a great honour and a surprise” to receive the prize. “I am extremely happy to know that more than 25 years of research on photonic-crystal surface-emitting lasers has been recognized in this way,” he adds. “I do hope to continue to further develop the research and its social implementation.”

Susumu Noda received his BSc and then PhD in electronics from Kyoto University in 1982 and 1991, respectively. From 1984 he also worked at Mitsubishi Electric Corporation, before joining Kyoto University in 1988 where he is currently based.

Founded in 1972 by the British industrialist and philanthropist Lord J Arthur Rank, the Rank Prize is awarded biennially in nutrition and optoelectronics. The 2026 Rank Prize for Optoelectronics, which has a cash award of £100 000, will be awarded formally at an event held in June.

The post Semiconductor laser pioneer Susumu Noda wins 2026 Rank Prize for Optoelectronics appeared first on Physics World.

  •  

Tim Berners-Lee: why the inventor of the Web is ‘optimistic, idealistic and perhaps a little naïve’

It’s rare to come across someone who’s been responsible for enabling a seismic shift in society that has affected almost everyone and everything. Tim Berners-Lee, who invented the World Wide Web, is one such person. His new memoir This is for Everyone unfolds the history and development of the Web and, in places, of the man himself.

Berners-Lee was born in London in 1955 to parents, originally from Birmingham, who met while working on the Ferranti Mark 1 computer and knew Alan Turing. Theirs was a creative, intellectual and slightly chaotic household. His mother could maintain a motorbike with fence wire and pliers, and was a crusader for equal rights in the workplace. His father – brilliant and absent minded – taught Berners-Lee about computers and queuing theory. A childhood of camping and model trains, it was, in Berners-Lee’s view, idyllic.

Berners-Lee had the good fortune to be supported by a series of teachers and managers who recognized his potential and unique way of working. He studied physics at the University of Oxford (his tutor “going with the flow” of Berners-Lee’s unconventional notation and ability to approach problems from oblique angles) and built his own computer. After graduating, he married and, following a couple of jobs, took a six-month placement at the CERN particle-physics lab in Geneva in 1985.

This placement set “a seed that sprouted into a tool that shook up the world”. Berners-Lee saw how difficult it was to share information stored in different languages in incompatible computer systems and how, in contrast, information flowed easily when researchers met over coffee, connected semi-randomly and talked. While at CERN, he therefore wrote a rough prototype for a program to link information in a type of web rather than a structured hierarchy.

The placement ended and the program was ignored, but four years later Berners-Lee was back at CERN. Now divorced and soon to remarry, he developed his vision of a “universal portal” to information. It proved to be the perfect time. All the tools necessary to achieve the Web – the Internet, address labelling of computers, network cables, data protocols, the hypertext language that allowed cross-referencing of text and links on the same computer – had already been developed by others.

Berners-Lee saw the need for a user-friendly interface, using hypertext that could link to information on other computers across the world. His excitement was “uncontainable”, and according to his line manager “few of us if any could understand what he was talking about”. But Berners-Lee’s managers supported him and freed his time away from his actual job to become the world’s first web developer.

Having a vision was one thing, but getting others to share it was another. People at CERN only really started to use the Web properly once the lab’s internal phone book was made available on it. As a student at the time, I can confirm that it was much, much easier to use the Web than log on to CERN’s clunky IBM mainframe, where phone numbers had previously been stored.

Wider adoption relied on a set of volunteer developers, working with open-source software, to make browsers and platforms that were attractive and easy to use. CERN agreed to donate the intellectual property for web software to the public domain, which helped. But the path to today’s Web was not smooth: standards risked diverging and companies wanted to build applications that hindered information sharing.

Feeling that “the Web was outgrowing my institution” and “would be a distraction” to a lab whose core mission was physics, Berners-Lee moved to the Massachusetts Institute of Technology in 1994. There he founded the World Wide Web Consortium (W3C) to ensure consistent, accessible standards were followed by everyone as the Web developed into a global enterprise. The progression sounds straightforward, although earlier accounts, such as James Gillies and Robert Cailliau’s 2000 book How the Web Was Born, imply some rivalry between institutions that is glossed over here.

The rest is history, but not quite the history that Berners-Lee had in mind. By 1995 big business had discovered the possibilities of the Web to maximize influence and profit. Initially inclined to advise people to share good things and not search for bad things, Berners-Lee had reckoned without the insidious power of “manipulative and coercive” algorithms on social networks. Collaborative sites like Wikipedia are closer to his vision of an ideal Web: an emergent good arising from individual empowerment. The flip side of human nature seems to come as a surprise.

The rest of the book brings us up to date with Berners-Lee’s concerns (data, privacy, misuse of AI, toxic online culture), his hopes (the good use of AI), a third marriage and his move into a data-handling business. There are some big awards and an impressive amount of name dropping; he is excited by Order of Merit lunches with the Queen and by sitting next to Paul McCartney’s family at the opening ceremony of the London Olympics in 2012. A flick through the index reveals names ranging from Al Gore and Bono to Lucian Freud. These are not your average computing technology circles.

There are brief character studies to illustrate some of the main players, but don’t expect much insight into their lives. This goes for Berners-Lee too, who rarely steps back to reflect on those around him, or indeed on his own motives beyond that vision of a Web for all enabling the best of humankind. He is firmly future focused.

Still, there is no-one more qualified to describe what the Web was intended for, its core philosophy, and what caused it to develop to where it is today. You’ll enjoy the book whether you want an insight into the inner workings that make your web browsing possible, relive old and forgotten browser names, or see how big tech wants to monetize and monopolize your online time. It is an easy read from an important voice.

The book ends with a passionate statement for what the future could be, with businesses and individuals working together to switch the Web from “the attention economy to the intention economy”. It’s a future where users are no longer distracted by social media and manipulated by attention-grabbing algorithms; instead, computers and services do what users want them to do, with the information that users want them to have.

Berners-Lee is still optimistic, still an incurable idealist, still driven by vision. And perhaps still a little naïve too in believing that everyone’s values will align this time.

  • 2025 Macmillan 400pp £25.00/$30.00 hb

The post Tim Berners-Lee: why the inventor of the Web is ‘optimistic, idealistic and perhaps a little naïve’ appeared first on Physics World.

  •  

Influential theoretical physicist and Nobel laureate Chen-Ning Yang dies aged 103

The Chinese particle physicist Chen-Ning Yang died on 18 October at the age of 103. Yang shared half of the 1957 Nobel Prize for Physics with Tsung-Dao Lee for their theoretical work that overturned the notion that parity is conserved in the weak force – one of the four fundamental forces of nature.

Born on 22 September 1922 in Hefei, China, Yang completed a BSc at the National Southwest Associated University in Kunming in 1942. After finishing an MSc in statistical physics at Tsinghua University two years later, in 1945 he moved to the University of Chicago in the US as part of a government-sponsored programme. He received his PhD in physics in 1948 working under the guidance of Edward Teller.

In 1949 Yang moved to the Institute for Advanced Study in Princeton, where he made pioneering contributions to quantum field theory, working together with Robert Mills. In 1953 they proposed the Yang-Mills theory, which became a cornerstone of the Standard Model of particle physics.
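
The obituary does not spell out the mathematics, but the central object of the theory is standard textbook material: promoting a global internal symmetry to a local (gauge) symmetry forces the introduction of gauge fields whose field strength contains a self-interaction term,

\[
% textbook non-abelian field strength; f^{abc} are the structure constants of the gauge group
F^{a}_{\mu\nu} \;=\; \partial_\mu A^{a}_\nu - \partial_\nu A^{a}_\mu + g\, f^{abc} A^{b}_\mu A^{c}_\nu ,
\]

and it is this last term, absent in electromagnetism, that makes non-abelian gauge theories rich enough to describe the strong and weak interactions of the Standard Model.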

The ‘Wu experiment’

It was also at Princeton where Yang began a fruitful collaboration with Lee, who died last year aged 97. Their work on parity – a property of elementary particles that expresses their behaviour upon reflection in a mirror – led to the duo winning the Nobel prize.

In the early 1950s, physicists were puzzled by the decays of two subatomic particles, known as tau and theta, which appeared identical except that the tau decayed into three pions with a net parity of -1, while the theta decayed into two pions with a net parity of +1.
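
The clash can be seen with one line of parity arithmetic (a standard textbook account, assuming for simplicity that the pions carry no net orbital angular momentum): each pion has intrinsic parity \(-1\), so

\[
% textbook illustration of the tau-theta puzzle
P(\pi\pi) = (-1)^2 = +1 , \qquad P(\pi\pi\pi) = (-1)^3 = -1 .
\]

If parity were conserved in the decay, one and the same particle could not feed both final states, yet the tau and theta had seemingly identical masses and lifetimes.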

There were two possible explanations: either the tau and theta were different particles, or parity is not conserved in the weak interaction. Yang and Lee proposed various ways to test the second possibility (Phys. Rev. 104 254).

This “parity violation” was later proved experimentally by, among others, Chien-Shiung Wu at Columbia University. She carried out an experiment based on the radioactive decay of unstable cobalt-60 nuclei into nickel-60 – what became known as the “Wu experiment”. For their work, Yang, who was 35 at the time, shared the 1957 Nobel Prize for Physics with Lee.

Influential physicist

In 1965 Yang moved to Stony Brook University, becoming the first director of the newly founded Institute for Theoretical Physics, which is now known as the C N Yang Institute for Theoretical Physics. During this time he also contributed to advancing science and education in China, setting up the Committee on Educational Exchange with China – a programme that has sponsored some 100 Chinese scholars to study in the US.

In 1997, Yang returned to Beijing where he became an honorary director of the Centre for Advanced Study at Tsinghua University. He then retired from Stony Brook in 1999, becoming a professor at Tsinghua University. During his time in the US, Yang obtained US citizenship, but renounced it in 2015.

More recently, Yang was involved in debates over whether China should build the Circular Electron Positron Collider (CEPC) – a huge 100 km circumference underground collider that would study the Higgs boson in unprecedented detail and be a successor to CERN’s Large Hadron Collider. Yang took a sceptical view, calling it “inappropriate” for a developing country that is still struggling with “more acute issues like economic development and environment protection”.

Yang also expressed concern that the science performed on the CEPC is just “guess” work, with no guaranteed results. “I am not against the future of high-energy physics, but the timing is really bad for China to build such a super collider,” he noted in 2016. “Even if they see something with the machine, it’s not going to benefit the life of Chinese people any sooner.”

Lasting legacy

As well as the Nobel prize, Yang won many other awards such as the US National Medal of Science in 1986, the Einstein Medal in 1995, which is presented by the Albert Einstein Society in Bern, and the American Physical Society’s Lars Onsager Prize in 1990.

“The world has lost one of the most influential physicists of the modern era,” noted Stony Brook president Andrea Goldsmith in a statement. “His legacy will continue through his transformational impact on the field of physics and through the many colleagues and students influenced by his teaching, scholarship and mentorship.”

The post Influential theoretical physicist and Nobel laureate Chen-Ning Yang dies aged 103 appeared first on Physics World.

  •