“The issue that economists face is that this standard theory does not work for intangible objects – which can be reproduced at scale without costs. If I were to sell ten computer chips, I would have to have enough raw materials and labour to create ten. If I sold a dataset, it could be copied and resold without effort.”
Dr Pierpaolo Vivo
05 November 2024
New framework uses games of chance to put 'price' on intangible assets
The new statistical model could have major implications for how data is bought and sold.
A new statistical model could help to address the age-old question of how to price non-physical, intangible goods like data, say scientists.
In a new study published in Physical Review Research, King’s mathematicians used a statistical physics framework using games of chance to evaluate the potential monetary value of these assets. They hope this might help lay the foundation for how companies dealing in data can fairly price their products.
Traditionally, the standard economic theory of how to price goods fails with non-physical objects like intellectual property and data.
As Dr Pierpaolo Vivo, author of the paper alongside Dr Alessia Annibale and PhD student Luca Gamberi, explains: “The general theory of supply and demand for objects balances how much of the item is available for purchase with how willing people are to buy it. Factoring in the cost of raw materials and labour to produce the object, you can determine a fair price.”
“The issue that economists face is that this standard theory does not work for intangible objects – which can be reproduced at scale without costs. If I were to sell ten computer chips, I would have to have enough raw materials and labour to create ten. If I sold a dataset, it could be copied and resold without effort, plus there will be no ‘wear and tear’ to bring the value down.”
This is a common problem for shops that offer discounts through programmes like supermarket loyalty cards, where price reductions on goods are pegged to how much the personal data collected from individual cardholders is expected to be worth.
Determining the price for intangible assets is further complicated when the sale could directly affect the seller’s market advantage – such as when a data company sells information on what could be a good investment in the stock market to others.
Without a benchmark, it is difficult to predict the data’s monetary worth when selling it could both create a competitor and remove the seller’s ‘edge’. In response, many data companies and data valuation providers either develop narrow systems for pricing niche types of data, or fall back on rule-of-thumb approaches that satisfy customers but have little to do with the fundamental mechanisms affecting the price of the data.
To address this, Dr Vivo and team analysed the problem in the context of games of chance, such as poker and roulette – where there is an underlying random process, but some players may have extra information they use to their advantage.
“Ultimately, this is the first step in what we hope to be a general theory for how all data should be priced and get it to function just like the materials which shaped our understanding of money in the world.”
Dr Pierpaolo Vivo
“Players in a rigged game of roulette may suspect the dealer is not being fair, but unless they’ve recorded the outcomes of throws over a long period of time, they can’t use that information. If one player has it, they can build a mental map of how this bias might impact where the ball will fall, and bet on the most likely outcomes.
“If that player gives this information to the other competitors, their chance of winning goes down and they should be compensated for the loss.
“Using this idea, we have essentially been able to mathematically compute the fair price that these uninformed players should pay to offset the seller’s loss of edge before they sell their data.”
This scenario is known as ‘informational asymmetry’ – where not all the players are ignorant of the game being rigged, and this information can be traded between players to make the system fairer, or more advantageous to those in the know.
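The compensation Dr Vivo describes can be illustrated with a toy calculation. The sketch below is not the model from the paper – it assumes a hypothetical European wheel rigged so one pocket is three times as likely as the others – but it shows the basic idea: the fair price of the bias data is the expected-value edge the buyer gains (equivalently, the edge the seller gives up) per unit bet.

```python
# Toy illustration (an assumption for this sketch, not the paper's model):
# pricing bias information on a rigged roulette wheel as the expected-value
# edge it confers per unit staked.

POCKETS = 37   # European wheel: pockets 0-36
PAYOUT = 36    # a winning single-number bet returns 36x the stake

# Hypothetical rigged wheel: pocket 17 comes up three times as often.
weights = [1.0] * POCKETS
weights[17] = 3.0
total = sum(weights)
probs = [w / total for w in weights]

def expected_value(p_win):
    """Expected profit per unit staked on a single-number bet."""
    return PAYOUT * p_win - 1.0

# Uninformed player: picks a pocket at random, so their effective
# win probability is the average over all pockets, i.e. 1/37.
ev_uninformed = expected_value(1.0 / POCKETS)

# Informed player: always backs the most likely pocket.
ev_informed = expected_value(max(probs))

# Fair price of the information, per unit bet per round:
# the edge the buyer gains and the seller loses.
fair_price = ev_informed - ev_uninformed
print(f"uninformed EV per unit bet: {ev_uninformed:+.4f}")
print(f"informed EV per unit bet:   {ev_informed:+.4f}")
print(f"fair price of the bias data: {fair_price:.4f}")
```

In this toy setup the uninformed player faces the usual slightly negative house edge, while the informed player's bet has a large positive expectation; the difference between the two is the compensation the seller should demand for sharing the bias record.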
The application of this model is currently theoretical, but the roulette game can be used as an analogy for the financial market. The researchers suggest that the framework could determine the correct “equilibrium point” at which sharing proprietary information creates a competitor stronger than the seller, and could provide appropriate price scaling to compensate for the lost competitive value.
“Ultimately, this is the first step in what we hope to be a general theory for how all data should be priced and get it to function just like the materials which shaped our understanding of money in the world.”