
Artificial intelligence and property valuation: When the public faces an opaque system

Lack of transparency and traceability in the new property valuation system makes it hard for people to understand and challenge decisions.

By Associate Professor Jøren Ullits, Department of Law, University of Southern Denmark

The Danish Tax Agency is rolling out a fully automated valuation system to set the tax base for Danish properties. It’s “fully automated” in the sense that decisions are made without human involvement in case processing or decision-making, at least in theory.

It’s commonly called the “property valuation system,” but that’s misleading. It’s not just one system but a whole web of interconnected systems and machine learning models, built using artificial intelligence.

When the first wave of home valuations (for 2020) was published, it sparked heavy public criticism, partly over the lack of transparency in the decisions. Many homeowners struggled to understand how the Tax Agency had arrived at their valuation, even though the documents explaining the method stretched across dozens of pages. The decision letters are about 22 pages long.

One homeowner, trying to check the legality of his property valuation, requested access to the data used for his 2020 valuation. The agency’s reply made it clear that only a fraction of the automated data processing – labelled as “data manipulations” in the decision – was actually stored. Granting the request would require reconstructing the valuation, which, according to the agency, might not even be possible, since the “current codebase and data structure had changed significantly” since the homeowner’s 2020 valuation was made.
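The traceability problem described above can be sketched in a few lines. This is a deliberately toy illustration, not the agency's actual system: the valuation formula, the 80% adjustment factor, and all names and numbers are hypothetical. The point is only that if every intermediate step is appended to an audit trail and persisted together with a code-version identifier, a decision can be reconstructed later even after the live codebase has changed.

```python
import hashlib

def value_property(area_m2: float, price_per_m2: float, audit: list) -> int:
    """Toy valuation that records every intermediate step in an audit trail.
    The formula and the 0.8 factor are hypothetical, for illustration only."""
    base = area_m2 * price_per_m2
    audit.append({"step": "base",
                  "inputs": {"area_m2": area_m2, "price_per_m2": price_per_m2},
                  "result": base})
    adjusted = round(base * 0.8)  # hypothetical cautious adjustment
    audit.append({"step": "adjustment", "factor": 0.8, "result": adjusted})
    return adjusted

audit_trail: list = []
valuation = value_property(120, 25_000, audit_trail)

# Persisting the trail together with an identifier of the code that produced
# it is what makes later reconstruction (and thus oversight) possible.
record = {
    "valuation": valuation,
    "trail": audit_trail,
    "code_version": hashlib.sha256(b"model-source-v1").hexdigest()[:12],
}
```

Storing only the final amount, as the agency's reply suggests it largely did, is exactly what makes the `record` above impossible to rebuild once the code changes.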

This answer might raise concerns that the Tax Agency has used different system versions within the same valuation period. This would be worrying, because when a system is modified (whether through upgrades or tweaks to its calculation model after it’s launched and used to make decisions), it can amount to a change in practice if the basis for valuation shifts. A basic principle in automated administration is equal treatment: automated decisions on the same type of case – say, valuations of owner-occupied homes for 2020 – should be based on the same calculation model. If the authority decides to change its approach, it must announce the shift.
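The equal-treatment check above is easy to express mechanically. The sketch below is hypothetical (the record layout and version labels are invented, not the Tax Agency's): if each decision records which calculation-model version produced it, any valuation period served by more than one version can be flagged as a possible unannounced change in practice.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Valuation:
    property_id: str
    period: int          # valuation period, e.g. 2020
    model_version: str   # identifier of the calculation model used
    amount_dkk: int

def model_versions_by_period(valuations: list[Valuation]) -> dict[int, set[str]]:
    """Collect the set of model versions used within each valuation period."""
    versions: dict[int, set[str]] = {}
    for v in valuations:
        versions.setdefault(v.period, set()).add(v.model_version)
    return versions

valuations = [
    Valuation("A1", 2020, "model-1.0", 2_400_000),
    Valuation("B2", 2020, "model-1.1", 3_100_000),  # second version in 2020
    Valuation("C3", 2021, "model-2.0", 1_900_000),
]

# Periods decided by more than one model version violate the one-model-per-
# period expectation and warrant an announced change of practice.
flagged = [p for p, vs in model_versions_by_period(valuations).items() if len(vs) > 1]
```

A check like this presupposes exactly what the agency's reply casts doubt on: that the model version behind each decision was recorded at all.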

For a decision to be reviewed for legality, the authority's actions must be traceable. If the agency stores only a limited subset of the calculated values during the valuation process, and the valuation can't be reconstructed later because of system changes, meaningful oversight becomes difficult. As a former US Supreme Court Justice once put it: "Sunlight is said to be the best of disinfectants."

In another case, a homeowner requested access to the source code of the valuation system, hoping to understand how his valuation had been determined. The source code contains the instructions and logic that define how the system operates, what functions it performs and how it interacts with users and data. The Tax Agency denied the request. The homeowner appealed to the Danish Tax Appeals Agency, which also rejected the request.

Rules on access to information allow people to request official documents that might help them understand the reasoning behind a decision. But these rules only cover documents and the information within them, because official records must be kept and structured according to archiving and documentation rules. Information about the calculation model used in an automated decision-making system and its source code doesn’t fit neatly into this framework. That makes it easy for the tax authority to reject requests for access to such material.

Denmark could follow France's lead by introducing a right of access to the source code of fully automated systems. That wouldn't necessarily help individual citizens understand why their own case was decided the way it was, but it would allow for public "grey box testing", i.e., critical investigation and troubleshooting by outsiders. In France, access to the source code of a controversial university admissions system revealed that when applications outnumbered available places, the system assigned spots at random – even though education law required university heads to provide reasons for each decision. In effect, places were awarded by rolling digital dice. The built-in allocation rule was illegal.

Giving access to the source code of systems like the property valuation system would be a symptom fix, helping citizens who can’t otherwise understand why a decision was made and who feel forced to find answers through indirect means. It probably wouldn’t be necessary if the agency had kept in mind when designing the system that the decisions made by an automated system must be clear and comprehensible to the average citizen.

 

Editing was completed: 13.03.2025