There's a difference between the intent of regulation and its effects in the real world
We have another of these lovely examples of how the intent of a regulation can be very different indeed from the effect of said regulation out there in the real world. We'll assume that most people are pretty cool with there being regulations against murder and punishments for breaching them. We're also pretty sure that such regulations and punishments reduce the number of murders that occur. So, sure, some regulations can indeed be beneficial and achieve their stated goal. We can also look around the world and see those gurning idiots in South America who think that if you peg the price of toilet paper nice and low then the poor will be able to afford toilet paper. Of course, what actually happens is that no one can get toilet paper at all, as no one is willing to make it for this new and lower price. Regulations can have the opposite effect to that intended.
And then there's, well, then there's this:
Fair or not, this latest evidence of the risks of informal surrogacy arrangements, in the context of Britain’s strict regulatory code, can only encourage more parents to bypass local options and head straight for a poorer or developing country. In India, for example, surrogates are plentiful, screened and by all accounts more dependable than British volunteers.
Leave aside, for a moment, any judgement on either the morality or desirability of such surrogacy. And consider the statement there. That strict regulation of who may do what and when drives the very activity itself out of the regulatory net. Does this regulation therefore achieve its aim? We would say probably not. The takeaway from this specific example is that, if one wanted to keep the activity inside the regulatory net, one would probably argue for a lighter touch with the regulation.
This observation is of a great deal more use than just talking about reproductive technology, of course. It is, from the one side, the argument used in favour of legal abortion: without the legality it would still take place on those fabled backstreets, and this would be worse. And it, from the other side, informs our attitude towards recreational drugs. As is obvious, it's going to happen anyway. So, loosen the regulations on whether people may take them or not so as to bring the activity within the regulations on purity and safety. Which is, as should be obvious, exactly the same as that abortion argument. Both are arguing that regulation should be pitched at the level that minimises harm, that only being possible when regulation is sufficiently light for the activity to remain regulated at all.
Thus it is essential that all regulation be "light touch" regulation. Within a wide and highly variable definition of "light" to be sure, dependent upon the specific activity. But it must always be light enough not to drive the activity underground and thus out of the reach of any regulation at all.
Here's one reason for slow growth: regulation
Perhaps we do want to drill for oil in the heartland of the Home Counties, perhaps we don't. That's rather a political question and probably one best solved by politics. Instead, we want to draw attention to one of the reasons why the richer countries tend to have lower growth rates:
It is understood that although UKOG, which is chaired by Mr Lenigas, and its partners have local planning permission and a licence for the site in Surrey they would still require formal approval from the newly established OGA before any flow testing operations - vital to the project’s commercial future - could begin.
UKOG, whose shares are traded on the London Stock Exchange’s junior Aim market, said in a statement to the stock exchange that planning permission to proceed at Horse Hill was in place and that it had “already submitted the applications to the authorities for their consent” to conduct a flow test of the well.
However, The Telegraph understands that no such application had been received by the Oil and Gas Authority as at the close of business today, April 28.
A spokesman for Mr Lenigas – who is playing an active role in driving the scheme forward – said that the company stands by its original statement “which is entirely accurate”.
According to guidelines, a well test would require planning permission from the Mineral Planning Authority, environmental permits from the Environment Agency (EA), a review of well plans by the Health and Safety Executive and finally regulatory consent from the OGA.
This process could take months, which if so would potentially put the company’s plans to flow test Horse Hill this year under pressure.
Leave aside, entirely, the interesting financial background to this. Consider instead this regulatory thicket.
It's entirely possible that this is environmentally sound. That it really should take many months for the authorities to decide whether someone should be allowed to carry out a flow test. We find that hard to believe, given that well tests are a normal, routine, and well understood part of the process. But let that stand: there's still, obviously, a restriction of the growth rate of the economy when people trying to do new things meet such regulatory thickets.
If there must be regulation then it needs to be simple, obvious and above all fast. The current regime doesn't meet those tests: and this is a part of the explanation as to why growth rates are slow. We've allowed the necessity of a certain amount of bureaucratic licence-gaining to become the equivalent of a barnacle-encrusted hull on a ship. This reduces the speed through the water: time to haul the system out for a good careening.
It really is planning that is the problem with housing and house prices
We know, we go on about this almost ad nauseam. But it really is true that the basic problem with housing and house prices is the planning system. An interesting paper from the CEPR allows us, once again and via a different route, to prove this:
How have house prices evolved over the long‐run? This paper presents annual house prices for 14 advanced economies since 1870. Based on extensive data collection, we show that real house prices stayed constant from the 19th to the mid‐20th century, but rose strongly during the second half of the 20th century. Land prices, not replacement costs, are the key to understanding the trajectory of house prices. Rising land prices explain about 80 percent of the global house price boom that has taken place since World War II. Higher land values have pushed up wealth‐to‐income ratios in recent decades.
It is not that houses have become more expensive to build. The standard three-bedroom suburban semi can be put up, from scratch, for £120k, and a little less than that in volume. What has become more expensive is the land. And, no, it's not that we're running out of land, nor even that land itself has become more expensive. Even prime agricultural land in the South East tops out at £10k a hectare.
It is that land upon which you are allowed to build a house has become more expensive. And that of course is an entirely artificial shortage caused by the planning system itself.
So, if we want to deal with the "housing crisis" what we need to do is reform the planning system. Probably back to the one we had before it caused this particular problem, which was, essentially, no planning system at all.
The terrible pollution from Chinese rare earth manufacturing
It's entirely true that socialist and communist mineral extraction methods are not quite as interestingly clean as we might like. It's also true that the technologies used in China for the disposal of the wastes from such processes are less clean or safe than we would accept in our own back yards. It is, however, possible to lose any sense of proportion about this. The BBC has a long piece about just that pollution:
You can see the lake on Google Maps, and that hints at the scale. Zoom in far enough and you can make out the dozens of pipes that line the shore. Unknown Fields’ Liam Young collected some samples of the waste and took it back to the UK to be tested. “The clay we collected from the toxic lake tested at around three times background radiation,” he later tells me. Unknown Fields has an unusual plan for the stuff. “We are using this radioactive clay to make a series of ceramic vessels modelled on traditional Ming vases,” Young explains, “each proportioned based on the amount of toxic waste produced by the rare earth minerals used in a particular tech gadget.” The idea is to illustrate the impact our consumer goods have on the environment, even when that environment might be unseen and thousands of miles away.
We admit, we've never really understood why every hippie is so fascinated by home-made pottery. But they are correct in that rare earth mining does mean radioactive substances. Almost all such ores have thorium in them. As no one uses thorium for very much, that thorium is left in the wastes rather than extracted. And do note that none is created: what is being done is that extant radioactive metals are being pulled out of the earth in one place and then dumped back onto the earth a few kilometres away. However, there's something rather more important here. What they're saying is true, there's radioactivity in that thar' lake. But is it an important amount?
Three times background? That's the difference between background radiation in London and in Cornwall. No, it's not an important amount.
Uber forms of governance
A few weeks ago Samuel Hammond posted some interesting thoughts on multi-sided platform (MSP) technologies like Uber and PayPal, and the role they play in providing forms of governance. You can read the whole thing here, but the outline is as follows: Governance—that is, things like rational planning to solve co-ordination problems, the setting of rules and assigning of rights, etc—is done not just by states but by private companies, too. Platform technologies (essentially those which enable interaction between different groups of people) are a good example of this. Amazon and eBay, for example, are virtual marketplaces which connect disparate buyers and vendors from across the world, whilst PayPal is not only a payments processor but an arbiter of commercial disputes, with procedures to file a complaint and rules on when compensation is due.
'Good' governance occurs when the rules set and rights assigned are reasonable to both sides of a transaction, and when the total societal cost of any regulation is kept to a minimum.
State governance schemes are well-intentioned, but can be rather crude and inefficient in their execution. Take taxi regulation, for example: licences guarantee a certain level of quality and safety for consumers, whilst regulated fares remove the cost and risk of having to haggle for every journey. At the same time, though, these regulations result in high prices, inflated barriers to entry, a lack of competition and little reason for incumbents to innovate or improve their service. Whilst the state tries to efficiently balance the costs regulation imposes on each party it often falls short, both because of the grand Economic Calculation Problem, and because of things like capture of the regulatory process by special interest groups.
In contrast, Hammond argues, MSP technologies are very good at being governance institutions. For example, Uber has ruffled so many feathers because its popular service is arguably a superior system of taxi regulation, thanks to its use of participant rating systems, safety features, an algorithmic pricing structure and so on. And, by controlling a bottleneck to market access, Uber to some extent acts as a de facto private licensing authority.
These 'market regulators' tend to be good at governance for a couple of reasons. One is that they're free to harness new technology and experiment with different setups, driving down transaction costs. Another is the fact that a rival platform may always come along and offer an alternative service—and, as Hammond notes, "the only way to supplant an incumbent platform is to adjust the governance structure in a way that social costs are better compensated by maximising the bargaining surplus". This ensures that even when a market regulator looks like a natural monopoly, so long as there is the possibility of exit, the cost of regulation will tend towards the social minimum.
Hammond argues that this means that MSP technologies are not just a bit better at governance than state institutions, but that "they potentially meet the economic definition of an ideal 'public interest' regulator". And, just as Uber challenges traditional taxi governance models, we can imagine a future where all property rentals are listed on something like Airbnb, with tenancy acts and local regulation displaced by market-set rules and regulations which efficiently balance the interests of renter and landlord.
Such a situation should perhaps not be understood as 'deregulation', but as a shift in the act of governance from the state to the firm, resulting, we assume, in reduced costs to society as a whole.
I particularly liked this post because it complemented some thoughts (and assuaged some fears) I'd had about the state harnessing new technology and commercial consumer insights to better perform its functions. In November I wrote a rather gloomy piece on 'algorithmic regulation' and the government's use of things like 'big data' and behaviour prediction to create more efficient, streamlined, and reflexive regulation, which, like the Google search algorithm, would be constantly reviewed and updated according to insights generated into 'what works'.
Such algorithmic regulation, I thought, could make government regulation more efficient, less irrational and less intrusive—but it could also open the doors to forms of dystopic technocracy. Once governments have the ability to create, access, and utilise vast swathes of information on their citizens, they're likely to want to expand their scope of operations. Perhaps they'll be tempted to 'connect the dots' between different types of lifestyles and tax income, health outcomes and the like. The opportunities for Nudge-on-crack policies could be everywhere. In addition, behind the seemingly apolitical goal of 'rule by algorithm' it's easy to smuggle in hidden political assumptions, and use questionable or untrue premises to dictate what our government supercomputers do. Nonetheless, I felt I was being an unwarranted techno-pessimist, so I filed my wonderings away before resurrecting them recently on my Tumblr.
However, Hammond's post sketches out an alternative governance system I'm much more of a fan of. Instead of harnessing private-sector insights to aid an intrusive and bloated state, he imagines a situation where government effectively outsources certain functions to private bodies (Uber as a private licensing authority, etc). And, because these bodies have to respond to market pressures, if they do a bad job or overstep the mark, parties can vote with their feet. These alternative governance systems are less likely to cater to special interests, and don't require the state to handle so many terabytes of personal data. We still have algorithmic regulation, but done more on the terms of the parties affected instead of the state.
To me, neither of these two futures of regulation seems implausible. But I certainly know which one I'd prefer.
Well done to The Guardian for joining the dots here
It is, of course, possible that we're all being mugged by the retailers and supermarkets. In the absence of sufficient market competition that's exactly what we'd expect, for capitalists are greedy for profit. That's why we like markets. It's also possible that the retailing business has intensive competition, meaning that vast pressure is being put upon those suppliers and retailers to deliver the lowest possible prices to us, the consumers. Either story is possible, but it's really most unlikely that both are true. So, well done to The Guardian here:
The competition regulator is to scrutinise allegations that UK supermarkets have duped shoppers out of hundreds of millions of pounds through misleading pricing tactics. Which? has lodged the first ever super-complaint against the grocery sector after compiling a dossier of “dodgy multi-buys, shrinking products and baffling sales offers” and sending it to the Competition and Markets Authority.
That's a version of the first story. That there's insufficient competition, we cannot take our trade elsewhere and so we're being rooked by the capitalists.
Meanwhile, new research suggests that more than 1,400 suppliers to Britain’s supermarkets are facing collapse as the cut-throat price war takes its toll on the industry.
The number of food and beverage makers in significant financial distress has nearly doubled to 1,414 in the last year, according to insolvency practitioner Begbies Traynor.
That's a version of the second story, that there's intensive competition to the benefit of us consumers. And it entirely contradicts the first story.
We really cannot have both happening in the same market at the same time. But according to The Guardian we do, because both examples are from the same article. The paper appears unable to connect the dots between the two and see that one or other of the stories must be wrong.
We might be at the right level of smoking regulation
Usually the costs of something rise as you do or have more of it, while its benefits fall. So unless the cost of the first unit is already very high, or the benefits of the first unit are already very low, there is some amount greater than zero that it is optimal to have or do. This is true for an individual and for a society. The first car you have massively changes your life. The second one adds pleasure, variety, and a good deal of practicality in some situations, but it's much less useful than the first. Nearly no one has three cars to themselves because the benefits are vanishingly small and the costs of storage and upkeep keep rising.
The first few million cars in a society the size of the UK are amazingly useful; the next million go to people who don't need them as much but do add to congestion, pollution and so on. The forty-millionth or sixty-millionth car starts taking up way more space than it's worth.
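To put the logic in textbook terms (the notation here is ours, a standard sketch rather than anything from a particular source): let B(q) be the total benefit and C(q) the total cost of quantity q, with diminishing marginal benefit and rising marginal cost. The optimal quantity q* solves

\[
\max_{q \ge 0} \; B(q) - C(q) \quad\Longrightarrow\quad B'(q^*) = C'(q^*),
\]

so as long as the first unit is worth more than it costs, that is B'(0) > C'(0), the optimum is positive but finite. The question for what follows is simply whether smoking regulation has passed that crossing point.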
Well, as the title suggests, I think it's possible we might be approaching (or already have gone past) this point when it comes to smoking regulation. I wrote before about how plain packaging was probably a mirage (incidentally, I learned today that the UK would drop three whole places, from 2nd to 5th, on the GIPC's index of intellectual property rights protections if it passed it; though many of this blog's readers probably wouldn't mind that).
We may have needed very activist laws in the past because it really is quite difficult to inform people about anything, and smoking is dangerous in important ways, such that if you don't understand the decision you're taking you really could make your life a lot worse. But nowadays people may well even overestimate the costs, and are thus taking roughly the right decisions to maximise social utility. It isn't a question of externalities, because smokers actually cost the state less by dying earlier (a bad thing!).
This isn't just a musing. A new paper in the Journal of Benefit-Cost Analysis ("Retrospective and Prospective Benefit-Cost Analysis of US Anti-Smoking Policies" (pdf) by Lawrence Jin, Donald S. Kenkel, Feng Liu & Hua Wang) says roughly the same.
We use a dynamic population model to make counterfactual simulations of smoking prevalence rates and cigarette demand over time. In our retrospective BCA (Benefit-Cost Analysis) the simulation results imply that the overall impact of antismoking policies from 1964-2010 is to reduce total cigarette consumption by 28 percent.
At a discount rate of 3 percent the 1964 present value of the consumer benefits from anti-smoking policies through 2010 is estimated to be $573 billion ($2010). Although we are unable to develop a hard estimate of the policies’ costs, we discuss evidence that suggests the consumer benefits substantially outweigh the costs.
We then turn to a prospective BCA of future anti-smoking FDA regulations. At a discount rate of 3 percent the 2010 present value of the consumer benefits 30 years into the future from a simulated FDA tobacco regulation is estimated to be $100 billion. However, the nature of potential FDA tobacco regulations suggests that they might impose additional costs on consumers that make it less clear that the net benefits of the regulations will be positive.
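To make the discounting mechanics in those numbers concrete, here is a minimal sketch of a standard present-value calculation. This is generic textbook discounting, not the paper's model, and the annual benefit figure is invented purely for illustration:

def present_value(annual_benefits, rate=0.03):
    """Discount a stream of annual benefits back to year 0 at the given rate."""
    return sum(b / (1 + rate) ** t for t, b in enumerate(annual_benefits))

# Hypothetical flat stream: $20bn of consumer benefit per year over the
# 47 years from 1964 to 2010, discounted back to 1964 at 3 percent.
flows = [20e9] * 47
print(f"Present value at 3%: ${present_value(flows) / 1e9:.0f}bn")

The point of the exercise is simply that a 3 percent rate roughly halves the weight of a benefit arriving 24 years out, so far-future gains count for much less than near-term ones.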
Especially cool is the chart of a rational level of cigarette smoking, based on the benefits to the smoker of fulfilling their preferences vs the costs of frustrating their preferences for better health and a longer life.
I'm not saying that this research is conclusive. One paper is one paper. But I think it's getting to the point where further cigarette regulation is becoming intrusive and costly without necessarily producing large benefits to its purported targets. We should consider if we've maybe hit the cigarette regulation sweet spot.
Who rules Britain: how much of our law comes from Brussels?
Business for Britain was right, on 2nd March, to question the proportion of our laws that comes from Brussels. Nigel Farage says it is 78%, Nick Clegg 7% and the House of Commons Library 13.2%, but even that last figure is an understatement, due to the Library’s omission of no fewer than 49,699 EU Regulations over the same 21 years to 2014. EU Regulations are not approved by Parliament and thereby escape the Library’s attention. From that, Business for Britain concluded that 65.7% of our legislation comes from Brussels. The figures, in fact, get murkier still, because the Library also seems to have omitted up to 2,000 statutory instruments a year, which would offset most of the swing. SIs are the UK equivalents of EU Regulations: both are secondary or “delegated” legislation and cover a broad range of rules, from laws in the full sense to temporary road closures. SIs can even be used to repeal primary legislation.
The proportion from Brussels is really beside the point; what matters is the total number of rules, from Brussels and Whitehall combined. Governments claim they will staunch the flow but little is done. Surely by now we must have enough laws?
Curiously, so far as business regulation is concerned, Whitehall is the bigger offender. In 1972 we signed up to a Common Market. That is the one bit of the EU we all like and let us hope that, and not much else, survives the EU renegotiation. A single market must have a single set of rules governing that market. You cannot have a single market if everyone makes their own. The market-maker is the EU and it is no more a loss of sovereignty to conform to their rules than, say, playing by the club’s rules when one joins a poker club. Sovereignty is being able to opt out.
Business, like poker, is competitive, so it is crazy to add one's own rules, hobbling one's own business, to those required by the club. Telling the others at the table that you will never raise on, say, two pairs stacks the odds against you. For this reason, counter-intuitively, it would be best if 100% of business regulation came from Brussels.
If a regulation is needed in the UK then we should ensure Brussels adopts it for the rest of the single market. If the others think it is unnecessary, we should think again. Rather than dreaming up its own business regulations, Whitehall should be staunching the 4,000-a-year flow from Brussels and ensuring that what does get through will deliver the open, fair and competitive single market we need.
Not only can we ditch all UK business regulations not required by the EU but, with all that new free time at their disposal, our civil servants can be out and about in the capitals of Europe developing best practice, closer working relationships and, in consequence, the simplest and best set of rules. In this game fewer is better, as anyone who witnessed the FSA's contribution to the banking crisis can testify. Indeed, they will no longer need desk space in Whitehall, probably the most expensive in Europe.
There is little truth in the widespread view that we must accept EU legislation without demur, beyond fine-tuning directive-based legislation a bit. The European Scrutiny Committee of MPs “assesses the legal/political importance of EU documents, deciding which are debated, monitoring the activities of UK Ministers in the Council and keeping developments in the EU under review.” In other words, it is supposed to be briefed with EU Regulations in draft and to seek to amend those not in the UK interest. How often does it do that? Hardly ever is probably a generous estimate. When that doughty EU fighter, Sir William Cash, became chairman, some of us hoped for action, but no, he was overcome by the same torpor as overwhelmed his predecessors.
In short, Business for Britain are right to complain about the excess of regulation from Brussels, but we should complain even louder about the excess from Whitehall and Parliament’s spineless defence of British business.
Why we shouldn't clamp down on zero-hour contracts
The Office for National Statistics has revealed that 697,000 people (about 2.26% of employees) are on zero-hours contracts in their main job, up more than 100,000 on a year ago. Such contracts make life uncertain for the employees concerned, who may not know from week to week, or even from day to day, whether they have paying work. Some 33% of those on zero-hours contracts say they would like to work more. So should we be clamping down on zero-hours contracts? No, we should not.
First, it is absolutely correct that zero-hours contracts have become far more common in the last two or three years. They hovered at about 0.5% for most of the period since 2000. They rose in use quite slowly between 2005 and 2012, then shot up to just under 2% in 2013 and to that 2.26% figure in 2014.
However, the unemployment rate has also come down in the last two or three years. In 2011 it stood at over 8%. Now it is less than 6%, and seemingly headed steadily down. Even though zero-hours contracts represent only a very small part of the labour force, it seems reasonable to argue that the two trends are related. The economic outlook is brighter, but is still uncertain; businesses remain unsure about the future, unsure about their markets, unsure of how much they should invest, unsure of how many workers they can justify taking on. A bust-up in the eurozone, for example, or a general election that delivers an unfavourable or unworkable government, might change the outlook completely for many UK businesses. So the only way that they can rationally expand their production, and be ready if things really do boom, is to cut their employment risk. Hence zero-hours contracts.
Remember too that even though the ONS talks about people's 'main' job, they might not be the only income earners in a household. The same is true of those on the minimum wage: many of them will be secondary earners. In fact, 34% of those on zero-hours contracts are aged 16-24 and half of those are in full time education. To them, a minimum wage job or a zero-hours contract, while frustrating, is not a disaster, and the extra income, however low or intermittent, is welcome.
Critics – you know who – say that the government has allowed a 'low-pay culture' to go 'unchecked'. So what would be their solution? Ban zero-hours contracts? Raise the minimum wage yet further? The inevitable result would be that employers would no longer be willing to take the risk of employing so many people. And the first to go would be young people, with fewer skills and less understanding of workplace culture than more experienced employees, and secondary earners, often women. There would be fewer 'starter' jobs through which young and unskilled people could gain experience, more young people trapped on benefits, and a rise in unemployment more generally.
What will do in zero-hours contracts, of course, is continuing economic growth. As unemployment falls, businesses will find it harder to attract employees, and workers and potential workers can become more choosy about the jobs they take. Zero-hours contracts will once again become a very small part of the employment market. Growth, employment, greater security. Job done, and not a politician in sight.
The Lords' Digital Agenda
On Tuesday the House of Lords Select Committee on Digital Skills released the 144-page report ‘Make or Break: The UK’s Digital Future’. It’s a typical government report, calling for ‘immediate and extensive action’ in something or other — in this case, unifying the government's current, disjointed digital initiatives with the launch of a grand ‘Digital Agenda’. (This masterplan includes such fabulous ideas as the middle-aged men in central government ‘future-proofing our young people’ through things like bolting a digital element onto all apprenticeship schemes.)

One of the report’s most newsworthy findings was London’s poor broadband speed compared with other European capitals. In a ranking of average download speeds London came 26th — nestled between Warsaw and Minsk — whilst the likes of Bucharest, Paris and Stockholm topped the chart. London also came 38th in a ranking of UK cities’ speeds (although it's worth noting that Bolton, the UK’s fastest city, would make the European capital ranking’s Top 10). The Lords' report is also concerned with the persistence of internet ‘not spots’ in urban areas, universal internet coverage and the rollout of superfast broadband. In response, it calls on the government to classify the internet as a utility service, with the desirable goal of universal online access.
It goes without saying that digital connectivity is vital to the modern economy, as is staying internationally competitive. However, a new, centrally-dictated ‘Digital Agenda’ is probably quite an ineffectual and expensive way of boosting the digital economy.
Despite the House of Lords' fears about the speed of superfast broadband rollout, coverage increased from 55-60% of the UK in 2013 to 70-75% in 2014. And, whilst the report holds up Cape Town as an example of a city providing universal broadband, this won’t be ready until 2030. In the time it takes for the state to roll out its chosen digital infrastructure, that infrastructure may already be out of date. Whilst many are still choosing between regular and fibre-optic broadband, landline-free 4G home broadband is the latest offering to hit London. At the same time, eyes are already on 5G and the new capabilities it can bring.
Treating the internet as a public utility is also problematic from a free-market standpoint. Doing so could, for example, lead to calls for more government involvement in the deployment and upgrading of internet infrastructure. However, a study by the Mercatus Center of American municipal government investment in broadband networks across 80 cities found that, for the billions of dollars of public money spent, there was little community or economic benefit.
It’s also the type of thinking which has led to America's ‘Net Neutrality’ debate, where, at the behest of Obama, the Federal Communications Commission has proposed to regulate internet service providers as 'common carriers', and in doing so to subject the net to a 20th-century public utility law originally devised to deal with the telephone monopoly. Ostensibly designed to protect consumers from the creation of ‘anti-competitive’ internet fast lanes for big content producers, Net Neutrality legislation threatens not only the speed, price and quality of internet provision, but also the autonomy of ISPs and investment at the core of the net.
Whilst the Lords' proposed 'Digital Agenda' might seem far removed from such heavy-handed state activity, a government that considers it its duty to take every single citizen online and 'digitally educate' them risks heading down an increasingly interventionist and expensive path.