AI, Satellites and Stone Quarries: Rethinking State Liability for Wrongful Closure Orders

Introduction

What is stone quarrying? Put simply, it is the extraction of stone from the ground for construction. In India, one cannot start a quarry just anywhere: quarrying is controlled by State governments through their mining departments, which grant leases and environmental clearances to ensure that natural resources are protected.

In the past, checking whether quarries were following the rules was both time-consuming and expensive, and illegal quarrying often went unnoticed. Satellite imagery, combined with AI tools, has now made the job far easier for regulators who rely on remote sensing: when satellite photos show where digging is taking place, AI assists mining officers by generating a list of suspicious spots.

But this can harm legal operators. Suppose the AI examines a satellite photo of a lawful stone quarry and, misled by shadows from road traffic, mistakes it for illegal quarrying and marks it as a suspicious spot. The mining officer sees the flag and immediately orders the site shut without visiting it. The owner and the workers suffer financial losses. Only later does the shutdown turn out to rest on an AI error.

This blog asks a simple question: when a computer mistake like this causes a government shutdown of a legal quarry, who is liable? The government? Or the private company that made the AI tool? The analysis that follows shows that both have a role to play.

How AI and Satellite Images Enter the Shutdown Process

First, let’s understand how satellite images and AI actually enter the decision‑making chain. The process usually starts with satellite imagery.

Satellite imagery consists of pictures of the Earth taken at regular intervals from orbiting satellites. In India, such images are regulated under national remote sensing policies and supplied through agencies like ISRO and NRSC. For quarry areas, instead of a human officer moving around with a notebook, satellites show where land has been dug, how large pits have become, and so on, covering a large area in little time. On top of this imagery, the State or a private company builds an AI-based tool. The AI compares new images with older ones and flags places where it thinks fresh excavation has happened or boundaries have been crossed. These flagged locations are sent to mining officials, who either schedule an inspection or directly issue a shutdown notice.
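The "compare new images with older ones and flag fresh excavation" step can be sketched in code. This is a deliberately simplified, hypothetical illustration: real systems use trained models on georeferenced multispectral imagery, while the sketch below just thresholds the pixel difference between two grayscale snapshots of the same area. The function name, grid size and threshold are all assumptions made for the example.

```python
import numpy as np

def flag_excavation(old_img: np.ndarray, new_img: np.ndarray,
                    cell: int = 4, threshold: float = 50.0):
    """Return (row, col) grid cells whose mean pixel change exceeds threshold.

    A flagged cell is a 'suspicious spot' the tool would forward to a
    mining officer for inspection. Purely illustrative change detection.
    """
    diff = np.abs(new_img.astype(float) - old_img.astype(float))
    h, w = diff.shape
    flags = []
    for r in range(0, h, cell):
        for c in range(0, w, cell):
            if diff[r:r + cell, c:c + cell].mean() > threshold:
                flags.append((r // cell, c // cell))
    return flags

# Toy example: an 8x8 "image" where one corner darkens sharply,
# simulating a fresh pit appearing between two satellite passes.
old = np.full((8, 8), 200.0)
new = old.copy()
new[0:4, 0:4] = 80.0
print(flag_excavation(old, new))  # → [(0, 0)]
```

Notice what the sketch makes concrete: the flag is produced entirely by a threshold the operator never sees, which is exactly why a shadow or lighting change can trigger a false positive.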

It is important to see that the quarry operator is never part of this technical chain. The operator neither sees the satellite images nor approves the AI model, and usually does not even know how the flags are generated. Therefore, when the AI makes a mistake and the State acts on it, the legal question is how far the State may rely on such tools without proper checks, and what duties it owes.

Who Owns and Controls the AI Outputs?

This is the most crucial question, because if the State cannot explain the AI output it relied on, quarry operators cannot challenge it effectively in court or on appeal.

The three main players:

  1. State mining department – the government body that issues the orders.
  2. Private AI vendor – the company that builds and runs the satellite and AI tool.
  3. Quarry operator – the licensed business that receives the notice.

The two main layers of data:

  1. Raw satellite images – pictures controlled by government agencies. These are not freely available and are only licensed to States and vendors.
  2. AI outputs – the flags and maps created when the raw images are processed.

The basic ownership arrangements:

  1. State owns everything: The contract says all AI maps and models created for this project belong to the government. This gives the State full control to explain the flags in court.
  2. Vendor owns the model: The private company keeps intellectual property rights to its AI software. This gives the State use rights only, not ownership rights.
  3. Mixed rights: Vendor owns the general AI model, while the State owns the quarry-specific outputs.

Why ownership matters: good contracts require audit rights and explainability; without them, the State may be unable to explain in court the flags it acted on. Ownership also feeds directly into liability – if the AI is wrong, can the State recover its costs from the vendor, or does it bear the loss alone?

Liability When AI Causes Wrongful Quarry Shutdowns

When AI wrongly flags a legal quarry and the State shuts it down, who pays for the losses?

  • Public-Law Liability: The State Answers First

In law, the quarry owner sues the State mining department, because the shutdown notice is a government order under mining laws.

When the order is challenged, the questions asked are: Was there a proper inquiry? Was the order arbitrary? Did it violate natural justice? Possible remedies include quashing the order, awarding compensation, and mandating site inspections for the future.

Courts hold the decision-maker accountable, even if tools helped make the decision.

  • Contractual Liability: State Makes Vendor Pay Back

The State–vendor contract creates a second layer of accountability. Like any other business deal, it is an agreement in which the vendor agrees to bear the losses its AI model causes the State. Such contracts set accuracy rules or performance standards, which force the company to keep improving its model instead of delivering 'good enough' tech. They also include payback clauses – known as indemnity for negligence – that allow the State to recover its money when legal trouble arises. This ensures accountability and transparency.

These clauses don’t give the quarry owner a direct lawsuit against the vendor. But they motivate the vendor to fix bugs and disclose known limits of its tool, and they give the State a recovery plan if things go wrong.

  • The Right Balance

The ideal system keeps remedies simple. Operators cannot be expected to track down private companies and file complicated suits; they directly challenge the government’s order in court. Therefore, it is the public authority that faces scrutiny upfront and bears the consequences, while vendors face responsibility through their State contracts. This split protects small businesses while pressuring vendors to deliver reliable systems, turning potential AI disasters into a fairer monitoring framework for everyone.

Conclusion

When an AI system misreads a legal quarry as an illegal site and the State reacts by shutting it down, the consequences fall on operators and workers who have followed the rules. The shutdown order is signed, issued and enforced by public officials, and it is the State that stands before the court when a quarry owner challenges it. Technology may act in the background, but it cannot replace the duty to make a proper inquiry, avoid arbitrary decisions and respect basic fairness. At the same time, it would be unfair to let private AI vendors walk away from repeated, preventable technical errors. Careful contractual design therefore helps the State recover its own losses and forces vendors to keep improving the tools they sell.


Author Name – Suhani Agrawal, a first-year BALLB (Hons.) student at National Law Institute University, Bhopal
