The ill-conceived US war on Iran is pushing the world toward a global recession or depression. Professor Richard Wolff places the current crisis in the context of the capitalist system and the decline of US hegemonic power.
Parental responsibility to be restricted for perpetrators of worst child sex offences
Victims given longer to challenge sentences they believe are too lenient.
Measures to support victims in speaking out against crime and punish criminals who refuse to attend sentencing hearings – to deliver fairer justice for all.
Victims of crime will benefit from a series of crucial measures to ensure they get the support, protection and dignity they deserve, as the Victims and Courts Bill becomes law.
The Victims and Courts Bill has received Royal Assent – marking an important step towards building a justice system that works better for victims.
For the first time ever, the new law will allow judges to hand down punishments to cowardly criminals who refuse to attend their sentencing hearing. The measures will also protect innocent children by restricting the exercise of parental responsibility by offenders sentenced for serious child sexual abuse, and where a rape has resulted in the birth of a child.
The new legislation will also ensure that victims are not prevented from speaking out against crime and have longer than ever to challenge sentences they don’t feel fit the crime they’ve suffered.
The move is a vital part of the Government’s Plan for Change – helping to deliver justice for all victims and restore faith in the justice system.
Deputy Prime Minister and Justice Secretary David Lammy said:
For too long, too many victims have been left to navigate a justice system where they often felt like an afterthought. Today, that changes.
The new Victims and Courts Act will make sure that victims’ needs, voices and rights are at the heart of our justice system.
It is an Act for survivors, for bereaved families and for everyone who has fought to be heard.
The new legislation will include a series of important measures for victims and bereaved families. These include, but are not limited to:
Introducing a court order requiring criminals to attend their sentencing hearings or face sanctions if they don’t – so that victims and bereaved family members see offenders face the full consequences of their actions;
Protecting the children and victims of vile sexual predators by restricting the exercise of their parental responsibility where the child is born from rape, or when they are convicted of serious child sexual abuse with a sentence of four years or more;
Giving victims and their families more time than ever to refer a sentence under the Unduly Lenient Sentence scheme if they don’t feel it fits the crime they’ve suffered – from the current 28 days to up to six months;
Ensuring that non-disclosure agreements (NDAs) cannot prevent victims from speaking out about criminal conduct to anyone for any reason;
Strengthening powers for the Victims’ Commissioner to better hold the justice system to account and ensure victims receive the support they need.
Minister for Victims and Tackling Violence Against Women and Girls Alex Davies-Jones said:
For every victim who has felt let down by the justice system, this legislation sends a powerful message: you matter, your voice matters, and your rights matter.
I would like to thank the remarkable families of Jan Mustafa, Olivia Pratt-Korbel, Sabina Nessa and Zara Aleena who have campaigned tirelessly to ensure offenders attend their sentencing so they can finally confront the reality of their crimes.
I would also like to thank Tracey Hanson and Katie Brett whose incredible work will help ensure other families don’t experience the injustices they faced by not being able to challenge sentences they felt did not fit the crime.
From today, we’re putting victims’ interests at the heart of the justice system, to protect them, support them, and deliver the fairer justice they deserve.
The new Victims and Courts Act will also give victims confidence about the routes available to receive information about their offender’s release, by updating post-conviction communication schemes.
It will also remove outdated restrictions on who the Crown Prosecution Service (CPS) can appoint as Crown Prosecutors, allowing a larger pool of suitably qualified candidates to take on the role.
The new Act builds on wider action the government has introduced to support victims and ensure they get the fast and fair justice they deserve.
The barrier to starting a business isn’t capital anymore. It isn’t a team. It’s just you, Axios’ Jim VandeHei writes.
The old rule: Before you could launch, you’d need to assemble an army — lawyer, accountant, developer, designer, copywriter, researcher. This cost, and complexity, left a million ideas unborn.
The new rule: Anyone with a strong idea and solid AI prompting skills can model and prep a new business in a weekend.
Why it matters: This is THE underappreciated upside of the same AI boom that many fear will eviscerate existing jobs. It’s never been this easy to start something with so little capital.
The startup boom appears to be underway. 580,612 new businesses formed in March 2026, according to Registered Agents Inc.’s monthly Business Formation Report — a 14% year-over-year jump.
The share of solo-founded startups climbed from 23.7% in 2019 to 36.3% by mid-2025, per Carta.
When we started Politico in 2007 and Axios in 2017, it took months to sketch out a website, mock up designs and scrub legal obstacles to our names and business.
We could do in hours now what would have taken weeks. You can, too.
The new math, function by function:
Legal scaffolding. Describe your setup to Claude or ChatGPT and get a plain-English LLC vs. S-Corp breakdown, a filing checklist and a draft operating agreement. You fact-check and fine-tune instead of paying a lawyer.
Market research. Paste in the concept. Ask for the steelman case (strongest argument) against it, the existing players, the pricing, the complaints. Build a customer survey in an afternoon. You talk it through with the AI instead of hiring a generalist researcher or firm.
Financial model. Outline your business case out loud: How will you make money and spend it? AI builds the spreadsheets and forecasts. Stress-test and iterate live with the LLM: “What assumptions are weakest? What threats or opportunities am I missing?” That’s two weeks of junior analyst work done before lunch. (A minimal code sketch of this step appears below.)
Brand and copy. Tell the LLM about your customer — not the demographic, the person — and generate a logo, a design, a homepage, email sequence and pitch. All that used to cost a lot of money and take six weeks. Now it requires only your judgment and an hour.
Product. Sketch out features and get working prototypes without writing code. Iterate even more simply with voice or text commands. You can easily change how it looks or works in minutes. If your business is more physical than digital, AI drafts your supplier outreach and negotiation scripts. Service? Positioning, packaging and pricing — all built by you and your AI.
Support, onboarding, docs. Done before launch with a near-zero marginal cost.

The “what’s left?” test: We’ve reached the decisions where the value of being human is everything. What remains are the parts that were always hard — the parts no tool can replicate.
Judgment: Knowing what’s worth building.
Taste: The ability to discern “good enough” from “market-ready.”
Trust: The human-to-human connection that survives the bot-generated marketing.

Resilience: The grit to keep going when v1 doesn’t land. Startups are fun as hell, but still hard.
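To make the financial-model step above concrete, here is a minimal sketch in Python of the kind of stress-test loop the column describes. It assumes the openai package and an OPENAI_API_KEY environment variable; the model name, prompt wording and forecast numbers are illustrative placeholders, not from the column.

```python
# Minimal sketch: ask an LLM to stress-test a toy startup forecast.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A toy first-year forecast a founder might have sketched in a spreadsheet.
forecast = """
Subscribers: 200 in month 1, growing 15% month over month.
Price: $29/month. Churn: 4%/month. Fixed costs: $3,000/month.
"""

response = client.chat.completions.create(
    model="gpt-4o",  # any capable chat model works here
    messages=[
        {"role": "system",
         "content": "You are a skeptical financial analyst reviewing a startup model."},
        {"role": "user",
         "content": f"Here is my first-year model:\n{forecast}\n"
                    "Which assumptions are weakest? What threats or "
                    "opportunities am I missing?"},
    ],
)

# Print the critique, then revise the forecast and re-run to iterate.
print(response.choices[0].message.content)
```

The same pattern (swap the prompt, keep the structure) covers the market-research and legal-checklist steps as well.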
The bottom line: For 30 years, the excuse for not starting was the cost of starting. Consider that excuse expired.
Worth watching … theMITmonk: How I’d Build a 1-Person AI Business (0 to $1M+) … Dan Martell: How to Build a $10M Solo AI Business … Anik Singal: How To Build a One-Person Solo Business Using AI.

If you start a business based on this column (and we hope you do!), shoot Jim a note: finishline@axios.com.

If you’re a CEO or on a CEO’s team: Ask to join Jim’s new weekly Axios C-Suite newsletter.
What do Ukraine’s robot soldiers mean for the future of warfare?
Remote-controlled weapons have been used for some time, but AI is now on the cusp of making battlefield decisions.
Attendees watch a robot on the second day of the Road to URC – Security and Defense Dimension conference at the Rzeszow University of Technology in Rzeszow, Poland, on April 27, 2026 [Wojtek Jargilo/EPA]
In a scene reminiscent of a computer war game, three battle-fatigued soldiers, dressed in white snow camouflage, emerge from a war-torn alley with their hands raised above their heads.
They crouch down, following the orders being blasted at them, fear and shock etched across their faces as they stare down the barrel of a machine gun mounted on a so-called ground robot.
This footage, released in January by Ukrainian defence company DevDroid, is said to show the moment Russian soldiers were captured by a Ukrainian robot using artificial intelligence.
In April, Ukrainian President Volodymyr Zelenskyy said that, for the “first time in the history of this war, an enemy position was taken exclusively by unmanned platforms – ground systems and drones”.
“Ground robotic systems have already carried out more than 22,000 missions on the front in just three months,” he wrote in a post on X, alongside images of green machines with tank tracks and weapons mounted on top.
But for analysts who have studied the intersection of artificial intelligence (AI) and warfare, the footage reflects an expected evolution – one that will unfold far beyond the front lines in Ukraine as the world wrestles with the ethical implications of controlling it.
UAVs, naval drones and robot dogs
For years, militaries have used ground robots primarily for bomb disposal and reconnaissance.
But in Ukraine, their role has expanded rapidly, with some brigades reporting that up to 70 percent of front-line supplies are now delivered by robotic systems rather than soldiers.
These machines transport ammunition, food and medical supplies, and evacuate wounded troops from dangerous positions.
Yet the sight of robotic systems moving across the battlefield is part of a much broader shift in warfare – one that has been building for decades.
The modern debate about AI in warfare was largely driven by the rise of US unmanned aerial vehicle (UAV) operations in the early 2000s.
In 2002, the MQ-1 Predator drone was used by the US to carry out one of the first targeted air strikes in Afghanistan, marking a turning point in how wars could be fought remotely.
Its use expanded rapidly throughout the 2000s and peaked in the late 2000s to mid-2010s, particularly in Pakistan, Yemen and Somalia.
As AI has advanced, the debate has moved beyond remote-control operations.
The focus shifted towards systems which can help identify targets, prioritise strikes and guide battlefield decisions, raising deeper questions about how much autonomy should be delegated to machines.
Analysts say the question of autonomy must remain central, rather than being overshadowed by rapid technological developments, however striking the sight of increasingly anthropomorphic machines on the battlefield may be.
“These technologies are here to stay,” Toby Walsh, an AI expert at the University of New South Wales, told Al Jazeera. He described AI-driven military operations as “the third revolution of warfare”.
The transformation is also spreading beyond land targets.
Naval drones packed with explosives have already reshaped battles in the Black Sea, while autonomous underwater systems are being developed for surveillance, mine clearance and sabotage missions by militaries worldwide.
Robotic dogs, meanwhile, are already being tested for surveillance, reconnaissance and bomb-disposal missions, with some experimental versions even fitted with weapons.
Human involvement
In recent years, the emergence of fully autonomous drones or so-called “killer robots” has triggered a fierce debate after a United Nations report suggested that Turkish-made Kargu-2 loitering munition drones, operating in fully autonomous mode, had identified and attacked fighters in Libya in 2020.
The incident prompted intense discussions among experts, activists and diplomats worldwide, as they grappled with the moral and ethical implications of a machine making – and executing – the decision to take a human life.
However, more regulatory focus is needed on the use of semi-autonomous weapon systems, “where humans are still so-called in the loop”, Anna Nadibaidze, a postdoctoral researcher in international politics at the Centre for War Studies, University of Southern Denmark, told Al Jazeera.
A major concern, she said, is whether “enough time and space” is being given to the “exercise of human judgement that’s necessary in the context of warfare”.
The extent of human involvement is often something observers have to take militaries at their word on, a difficult task when their actions leave trust in short supply, Walsh said.
In the case of ground robotics in Ukraine, a human operator has, so far, remained in control, directing machines that can still be halted by obstacles such as uneven terrain.
However, when AI is involved in the decision-making process, as is the case in Israel’s attacks on Gaza and the wider region, the scale of attacks which have resulted in “huge collateral damage and civilian casualties for a small number of military targets” challenges the rules of international humanitarian law and, in particular, the idea of proportionality, Walsh said.
The issue, Nadibaidze said, is that it is hard to enforce rules on the use of AI in warfare, as it is essentially “a matter of each military to decide what they consider to be a sufficient role for the human, and there isn’t enough international debate on that”.
An April report by the Stockholm International Peace Research Institute warned that the AI supply chain is also fragmented, global and heavily dependent on civilian technologies, further complicating efforts to govern or control military uses of AI.
The United States Department of Defense is steadily incorporating privately developed software systems into its war apparatus.
In the middle of last year, the Defense Department awarded OpenAI a $200m contract to integrate generative AI into the US military, alongside $200m contracts for xAI and Anthropic.
“If we’re not careful, warfare will be much more terrible, much more deadly, a much quicker, much faster thing that humans can no longer actually really be participants in, because humans won’t have the speed, won’t have the accuracy or the ability to respond,” Walsh warned.
Ukraine as a testing ground
Technology and AI are not inherently harmful, experts say – it is how they are used that matters.
In Ukraine, ground robotic systems have also been used to rescue civilians and provide logistical support in heavily mined and treacherous conditions.
Yet what is unfolding on the front line is, in many ways, a testing ground, and the international community will need to look ahead to how these technologies might be applied and regulated in future conflicts.
There is also room for cautious optimism. Despite the “moral failure” over Israel’s actions in Gaza, Walsh said, there is a recognition in the international community that these issues must be addressed, including a series of UN meetings focused on regulating Lethal Autonomous Weapons Systems.
The United Nations Institute for Disarmament Research (UNIDIR), an autonomous body within the UN which conducts independent research on disarmament and international security, is set to meet in June to examine the implications of AI for international peace and security.
It is not the first time new weapons technologies have threatened to upend the rules-based order, said Walsh, pointing to chemical weapons as an example. While imperfect, international agreements were eventually put in place to bring those under some level of control.
“There are a lot of actors based in the Global South that do want regulation, so there might be regional initiatives forming,” said Nadibaidze, adding that even if such efforts do not initially include major powers or leading tech developers, they could still help to shape emerging norms.
The LEGO-style videos making fun of Donald Trump over his war in Iran have racked up millions of views on social media. FRANCE 24’s Tehran correspondent spoke to the content creator blurring the line between entertainment and propaganda.
Oil prices reached their highest levels since the Iran war began, with Brent crude topping $126 per barrel this morning before pulling back, Axios’ Ben Geman reports.
The jump will keep pushing U.S. gasoline prices higher amid the possibility of a long stalemate that keeps the Strait of Hormuz throttled.
The nationwide average price for a gallon of regular gas is $4.30 as of this morning.
That’s up 7¢ from yesterday, up nearly 30¢ in a week and more than a dollar higher than a year ago, per AAA.
Apr 29, 2026: When Ian Bremmer mentioned that the US now scrubs 5 years of social media for anyone entering the country, Trevor Noah’s reaction said everything. “If China did this tomorrow, people would never go to China.” A simple fact, stated plainly, that challenges everything you think you know about American values. Full episode with Ian Bremmer now on What Now? with Trevor Noah — watch here on YouTube.