Madsen Pirie

Stalin and remembrance

On October 30th, 1961, the Soviet Party Congress voted unanimously that Joseph Stalin’s embalmed body should be removed from Lenin’s Mausoleum in Red Square and buried nearby, by the Kremlin wall, with a plain marker. It was a continuation of the de-Stalinization process begun in 1956, when Nikita Khrushchev gave his secret speech to a closed session of the 20th Congress of the Soviet Communist Party. In that speech, Khrushchev had denounced the late Soviet leader for his wide-scale repression and personality cult.

Revered as a godlike figure in his day, Stalin was in fact a thug and mass murderer. Historians still dispute how many millions he killed, but all agree that it was many millions. The historian Timothy Snyder has used Soviet archives, opened after the collapse of the Soviet Union, to conclude that the figure was 9 million, including 6 million he personally ordered killed. His estimate is lower than those of other historians, who have used demographic and census figures to argue that the archive records are incomplete and understate the true death toll by a considerable amount.

Of those executed or deliberately starved to death in the socialist republics over whose union Stalin presided, some put the figure as high as 20 million. This figure includes the known atrocities committed against the peoples of other nations, such as the murder of some 22,000 Polish officers and prisoners in the Katyn forest and at other sites in 1940.

It is a coincidence that the same day of the year, October 30th, is also the official Day of Remembrance of the Victims of Political Repression under communist rule in the USSR. It was officially declared as such in 1991, and is marked by solemn memorials every year. The date is officially observed across the Russian Federation, giving people occasion to contemplate the crimes that the socialist regimes perpetrated.

People gather at the Solovetsky Stone, a monument commemorating the victims. It features a stone from the Solovki concentration camp, now installed in front of the old KGB headquarters and prison on Lubyanka Square in Moscow. Although he was himself a KGB officer in his younger days, President Putin has appeared alongside the Patriarch and Primate of the Russian Orthodox Church to pay respect and homage to the victims. Sometimes people recite the names of some of those who died.

The scale and brutality of the socialist empire of Eastern and Central Europe staggers the memory and the imagination. But it happened, and it ruthlessly destroyed the lives of millions in support of its poisonous ideology.

So that today’s generation, many of whom are oblivious to its sheer wickedness, might be made aware of what it did, work is in progress for museums to be established in Washington DC and in London, to record and to present some of its horrors. Gulag victims who survived, and relatives of those who did not, have recorded memories that bring home the stark reality of its evil. The museums will provide a timely reminder of the crimes that the Soviet system perpetrated. It is to be hoped that school parties will tour them, as some visit the Nazi death camps, to remind today’s young people of the horrors that a perverted ideology can and did unleash.

Tim Worstall

The Reverend Malthus Prize for being absolutely 100% correct

An informal and occasional prize, to be sure, but one we think worthwhile.

We generally think of the Reverend Malthus as being entirely wrong. After all, the last two centuries have been proof positive that technological advance can indeed raise the living standards of all, substantially and sustainably, rather than just turning up as more people living in the same old subsistence style. The thing is, though, Rev. Malthus was entirely correct for all of history up to the moment he sat down to write. Other than this past couple of hundred years, it was true that economic growth turned up as just more people. In parts of the world it continued to be so - India’s GDP was very much larger when the British left than when they arrived. But that it was spread across some four times as many people meant that GDP per capita hardly budged. That is, it’s not that Malthusian growth stopped happening, it’s that non-Malthusian growth started, in some places, happening.

Which gives us the outline of the prize. Someone who was entirely correct given history and knowledge to that time. But who managed to write down, insist upon, that truth at just the moment it stopped being true.

At which point we give you Jean Gimpel, from his lovely little book “The Medieval Machine”. His record of how the early Middle Ages, rather than the later Renaissance, incorporated the sort of technological advance that led to the Industrial Revolution is what we find most interesting. But it’s this from the preface that qualifies for the prize:

We are witnessing a sharp arrest in technological impetus, save in the military field: it was in the declining Middle Ages that the cannon was developed. Innovations - that is, inventions that have been financed, tested and made commercially available - are few and far between, a fact particularly remarkable in the pharmaceutical industry. Even computers have not spread into every home in the country, as was forecast. Like every previous civilisation, we have reached a plateau.

While I hope that the reader of The Medieval Machine will want to pursue his own comparisons, I must point out one alarming contrast. The economic depression that struck Europe in the fourteenth century was followed ultimately by economic and technological recovery. But the depression we have moved into will have no end. We can anticipate centuries of decline and exhaustion. There will be no further industrial revolution in the cycles of our Western civilisation.

It’s that this was published in 1976 which qualifies it. It’s entirely true that the computer was not in every home at this point. Yet Intel was founded in 1968 and created the first commercial microprocessor in 1971; Apple was founded in 1976, the Apple I was released that year and the Apple II - actually useful - in 1977. The TRS-80 and Commodore PET followed in 1977, the IBM PC in 1981. The mobile phone, which is by far the most widespread computer today, was demonstrated in 1973, and Arpanet was up and running by 1971.

That statement about computers and technological advance was made at the exact moment that it became untrue.

We welcome further entries for this very occasional prize…

Charlie Paice

The AI Economy by Roger Bootle

The AI Economy by Roger Bootle is a very good read. Roger is Chairman of Capital Economics, Europe's largest macroeconomic consultancy, and he also writes a column for The Daily Telegraph.

Throughout the book, Bootle gives a clear introduction to the relevant topics surrounding AI, from the different types of universal basic income to the supporting arguments and critiques of Piketty's work on inequality. His style involves clearly laying out theories and their critiques, sometimes even in bullet point lists, before giving his own judgement.  

The book ranges from discussion of whether deficient demand might weigh on bond yields to which school subjects will be in demand in the future. This makes it an informative book for everyone from traders to students.

He remains optimistic throughout the book whilst still recognizing the challenges that the new AI economy introduces, steering a middle course between the predictions of Utopia and Dystopia often associated with the subject to offer a realistic and grounded vision of the future. For example, while he is confident that advances in technology will deliver higher living standards, as they have done throughout history, he accepts that the immediate impact will likely make some people worse off through structural unemployment. Nonetheless, he is adamant that “AI will not spell Armageddon for employment.”

He is helped by his decision to leave discussion of 'The Singularity' (when AI becomes fully conscious) until the end. If we do reach this point, which Bootle suggests we never will, then it would be almost impossible to predict what would happen next, as human labour quickly becomes redundant and humanity is either destroyed by technology or assimilated into it.

On the role of the state in the development of AI, Bootle takes a fairly free-market approach. While he believes the government should neither tax nor subsidise progress in this field, he makes a strong case for drawing up an appropriate legal framework, as well as for restrictions on the development of AI in areas where it could be used by criminals or for weapons.

When it comes to education, he identifies the increasing importance of the humanities and the critical thinking skills they develop. He believes that technology will dramatically improve the education sector, from freeing teachers from more mundane tasks to bringing education to a wider audience through online classes.

He moves on to cover UBI, which is probably the most discussed policy area when thinking about the AI economy. Bootle argues that any implementation of UBI would be either unacceptably low or unacceptably expensive, and thus unfeasible. However, although he covers negative income taxes earlier in the same chapter, it would have been good if he had returned to them, either to offer them as an alternative to an unfeasible UBI or to explain why they too would be unfeasible. His only criticism of negative income taxes, which he credits to Friedman himself, is that they incentivise people to lie about how much they earn in order to get more money. This incentive, however, already exists in any system of direct taxation, and under-reporting is in any case fairly hard to manage under the PAYE system. He may have been more comfortable not covering this area because he predicts that technological change will not necessarily either increase inequality or remove work - at least until the singularity. We all hope this to be true, but as he himself says, this may be an area that could do with some more academic research.
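To make that incentive point concrete, here is a minimal sketch of how a negative income tax works. The threshold and rate below are purely illustrative assumptions, not figures from Bootle or Friedman; the point is simply that, because the payment is a fraction of the gap between a threshold and reported income, under-reporting earnings mechanically raises the top-up received.

```python
# Minimal sketch of a negative income tax (NIT); all figures are illustrative assumptions.
THRESHOLD = 20_000   # hypothetical break-even income (GBP)
NIT_RATE = 0.5       # hypothetical subsidy rate on the shortfall below the threshold

def nit_payment(reported_income: float) -> float:
    """Top-up paid when reported income falls below the threshold."""
    return max(0.0, NIT_RATE * (THRESHOLD - reported_income))

actual_earnings = 15_000
honest = actual_earnings + nit_payment(actual_earnings)   # 15,000 + 2,500 = 17,500
dishonest = actual_earnings + nit_payment(10_000)         # 15,000 + 5,000 = 20,000 if 5,000 is hidden

# Friedman's worry: hiding 5,000 of earnings raises the payment by 2,500 - though the same
# incentive to under-report exists under any system of direct taxation.
print(honest, dishonest)
```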

He makes the amusing observation that what we view as AI shifts all the time, and that much of what we consider normal computer processing now would have been considered 'AI' a few years ago. Although it is difficult to predict the future, Bootle does give a very grounded and realistic view of it. To dwell on the predictions, however, would be to overlook the real value of the book, which is to enable the reader to enter the debate themselves by introducing the key concepts and issues. As an introduction to the debate as a whole, this book is invaluable.



Madsen Pirie

Black Tuesday crashed the world's economy

The fourth day of the Great Stock Market Crash was October 29th, 1929, known to history as "Black Tuesday." The Dow Jones Industrial Average dropped 25% over the four days, and investors lost $30 billion. This was around ten times the federal government's budget for 1929, and more than the United States had spent on World War I.

The market collapse brought an end to the decade of exuberance and excess known as the Roaring Twenties. The optimism that followed World War I had seen a huge expansion of American industry, coupled with an agricultural decline caused by massive overproduction. People had flocked to the cities to share in the new prosperity, and invested their savings in the ever-rising stock market, rather than in more secure long-term assets. Money was easy, and brokers let people buy shares on margin. People thought the market would rise forever, but it didn't.

Following the Great Crash, the Dow saw another slide, a longer one, from April 1930 to July 1932, when it hit its lowest point of the 20th Century, a loss of 89% from its 1929 peak. It heralded the beginning of the 12-year Great Depression that affected all Western industrialized countries. Bankruptcies followed in the wake of the stock price declines, and the credit contraction induced business closures, bank failures, mass lay-offs, and a collapse of consumer confidence.

It was a time of great misery as businesses and farms closed and people's savings disappeared overnight. There were suicides, and New York hotel clerks were reportedly asking guests, "Do you want a room for sleeping or jumping?" As people deserted stocks and went for commodities such as gold, the Fed tried to boost the value of the dollar by raising interest rates. This move was also motivated by a belief that easy credit had led to the over-indulgence of stock buying that had ultimately precipitated the collapse.

There is still dispute about what caused the crash, but the analysis by Milton Friedman and Anna Schwartz in "A Monetary History of the United States" is now widely accepted as the explanation of why it led to the Great Depression. That analysis points to the collapse of the banking system during three waves of panics from 1930 to 1933 as the source of the credit famine that brought about bank failures and business and farm closures.

As the Financial Crisis of 2008 unfolded, those in authority decided to learn from the mistakes of history, rather than to repeat them. They loosened the purse strings with quantitative easing and ultra-low interest rates, and no second Great Depression manifested itself. While it is too soon to be completely confident, it looks as though the crisis might have been contained. It has, however, left a great deal of extra money in the economy, distorting people’s view of real demand, and encouraging unjustified investment and misallocation of resources. It will take some delicate handling to remove it at a pace that does not trigger another collapse.

Tim Worstall

A bit early for panto season but "Oh No It Doesn't!"

We’re told that the gender pay gap blows out to an incredible 28% for women in their 50s.

No, really, it doesn’t:

The analysis by Rest Less, a jobs, volunteering and advice site for the over-50s, reveals that both sexes reach their peak full-time salaries in their 40s: for women this is £34,665 and for men it is £46,213, a difference of £11,548 or 25%.

But the data shows that the gap grows again over the next decade, with the mean average salary for a woman in her 50s, working full-time, being £32,052 compared with £44,561 for a man in his 50s, a difference of £12,509 or 28%.

As we, the ONS, even the Statistics Ombudsman, have been pointing out over the years, we should not - because it is entirely misleading - be using mean wages. Instead, the median tells us what we actually want to know; the sketch at the end of this post illustrates why. What also rather worries us is this:

The gap between men and women working part-time is 8% for workers aged between 18 and 21. This rises slightly to almost 10% for those in their 20s and again to 12% for those in their 30s.

They’ve got the sign wrong. As the ONS points out, the part-time gender pay gap is in favour of women.

As we like to remind people from time to time, we’ll never be able to solve a problem unless we start by identifying reality. Which we should probably start doing.
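To see why the choice between mean and median matters so much, here is a minimal sketch using entirely made-up salary figures - they are not Rest Less's or the ONS's data. A handful of very high earners pulls the mean up sharply while barely moving the median, so a mean-based gap tells us more about the top of the distribution than about what the typical man or woman is paid.

```python
import statistics

# Entirely illustrative, made-up full-time salaries (GBP); the men's sample
# includes a few very high earners, as the real distribution does.
women = [24_000, 27_000, 29_000, 31_000, 33_000, 35_000, 38_000]
men = [25_000, 27_000, 30_000, 32_000, 34_000, 90_000, 150_000]

def gap(stat) -> float:
    """Pay gap as a percentage of the male figure, for a given statistic."""
    w, m = stat(women), stat(men)
    return (m - w) / m * 100

print(f"mean-based gap:   {gap(statistics.mean):.0f}%")    # inflated by the top earners
print(f"median-based gap: {gap(statistics.median):.0f}%")  # closer to the typical worker's experience
```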

Madsen Pirie

When Faraday broke the mould

Michael Faraday recorded in his laboratory diary on October 28th, 1831, that he was "making many experiments with the great magnet of the Royal Society." He demonstrated the link between magnetism and electricity, and went on to invent the dynamo, the precursor of modern electrical generators, as well as laying the groundwork for the understanding of electromagnetic radiation.

The young Michael Faraday, born in Southwark in 1791, was the third of four children of a blacksmith who had recently moved from Westmorland. Having only the most basic school education, Faraday had to educate himself by reading, and he devoured scientific books.

When he was 20, Faraday attended lectures by the renowned chemist Humphry Davy of the Royal Institution and the Royal Society. When he sent Davy a 300-page bound copy of the notes he'd taken, Davy employed him as his personal assistant, and as chemical assistant at the Royal Institution.

Faraday made a major discovery when he wrapped two insulated coils of wire around an iron ring, and found that upon passing a current through one coil, a momentary current was induced in the other coil. This phenomenon is now known as mutual induction. He deduced that there was only one type of electricity, contrary to the prevailing view that there were several, and he showed that changing the values of the quantity and the intensity – today's current and voltage – would produce different effects.
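In modern notation - which is ours, not Faraday's, since he reasoned pictorially in terms of lines of force rather than equations - the effect is summarised by the law of induction that now bears his name: an electromotive force appears only while the magnetic flux through a circuit is changing, which is why the current in the second coil showed up only at the moment the first was switched on or off.

$$\mathcal{E} = -\frac{d\Phi_B}{dt}, \qquad \mathcal{E}_2 = -M\,\frac{dI_1}{dt},$$

where $\Phi_B$ is the magnetic flux through the circuit, $I_1$ the current in the primary coil, and $M$ the mutual inductance of the pair.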

Faraday's work changed the world, and lies behind today's power generation, and his discoveries and developments underpin every electric motor in use today. He was remarkably modest. He refused a knighthood, having a religious objection to earthly rewards, and declined the offer of burial in Westminster Abbey, though there is a plaque honouring him near Newton's memorial.

Faraday undertook a series of Christmas lectures for young people at the Royal Institution, a series that continues today. Their aim was to present science to the general public in the hope of inspiring them. His lectures were described as joyful and juvenile: he would fill soap bubbles with various gases (in order to determine whether or not they were magnetic) in front of his audiences, and demonstrated the rich colours of polarized light.

Albert Einstein kept a picture of Michael Faraday in his office, along with those of Isaac Newton and James Clerk Maxwell. There is a statue of Faraday in Savoy Place, London, outside the Institution of Engineering and Technology, and innumerable laboratories and university buildings are named after him.

Faraday showed that someone from a humble background, but someone with a thirst for discovery and an ambition to find out how the universe works, could write important pages in the catalogue of knowledge. He never patented any of his discoveries, leaving them open source for humankind, and he never tried to make money from them. He was quoted as saying: “I can at any moment convert my time into money, but I do not require more of the latter than is sufficient for necessary purposes.”

Humankind could use more Faradays to improve its condition and enlarge its choices, and there ought to be ways in which we can spot potential ones early and ensure that they are not held back by pressures to conform and to fall in with consensus. Faraday was a mould breaker, and we should strive for an educational system that encourages young minds to emulate him.

Tim Worstall

Micro-houses aren't the solution to homelessness, no

The Observer provides a fine example of Betteridge’s Law for us:

Are micro-houses the solution to Britain’s homelessness crisis?

The answer is, of course, no:

With a single bed, a chemical toilet and a phone charger in a very tight space, a “micro sleeping pod” is very much basic living. But for those that live in the tight shelters, set up by a charity, they provide a link between living on the streets and finding more long-term accommodation. And then there is the cost – at £10,500 for the pair, the pods in Newport, Wales are significantly cheaper than building a flat, according to Amazing Grace Spaces.

From Bristol to London, architects, planners and charities are developing unique styles of accommodation to cope with the housing crisis. With private rents increasing and the local housing allowance frozen until at least 2020, homelessness continues to spiral – last year Shelter said at least 320,000 people were homeless in Britain, up 4% on the previous year.

The reason is our definition of homelessness. Rough sleeping concerns somewhere between 5,000 and 10,000 people. The lower figure is roughly the number on any one night, the higher the number passing through that status over a year.

That rough sleeping number contains two classes. There are those who pass through the status - runaways, those evicted possibly and so on - and gain housing rather quickly. Certainly, we would prefer that none pass through this status, but no system is going to be perfect. The truth is that those whose only problem is nowhere to stay do get found somewhere to stay rapidly.

The second class - perhaps the habitual rough sleepers - nearly all suffer from one or more significant problems of addiction or mental health. Their problem isn’t a shortage of housing, it’s being able to stay, being competent to stay, in housing once it is made available. More housing simply isn’t a solution to this set of problems. Reversing Care in the Community might be.

We might also observe in passing that when for-profit economic actors convert old office blocks into rather more spacious - and connected to the mains sewerage - living spaces, they are decried as spiv rentiers rather than brave charitable fighters against the evils of our age. But then, you know, propaganda.

But the real reason the headline confirms Betteridge’s Law is this: Shelter will still define — they have to, it’s how they reach that 320,000 number — people living in micro-houses as homeless. Therefore micro-houses cannot be the solution to homelessness by the definition being used, can they?

Madsen Pirie

China passes a billion

On October 27th, 1982, the People's Republic of China announced that its population had passed one billion, which at the start of the 19th Century was estimated to be the population of the whole world. That figure has risen steadily since the Industrial Revolution and the development of modern medicine and sanitation enabled by the wealth it generated. It passed 3 billion in 1959, and since the 1970s has risen by a billion roughly every 12 years. It passed 6 billion in 1999, then 7 billion in 2011, and currently stands at 7.742 billion.

Some commentators doubt the planet can support such numbers. Thomas Malthus predicted starvation because a finite amount of land could not support a population that could grow without limit. Paul and Anne Ehrlich, latter-day Malthusians, published "The Population Bomb" in 1968, predicting widespread famine in the 1970s and 80s. There have been many calls for the world to limit its population growth, and China responded with its 'one child' policy in 1979, a policy that lasted, with modifications, until 2015. The policy caused many social problems and was widely unpopular. It led to sex-selective abortions and even infanticide, given the Chinese cultural preference for at least one male child. Sir David Attenborough is among the voices that today call stridently for limits to be imposed.

Obviously, an increased population will need more food and more drinkable water. It will consume more resources, need more houses, light more fires, drive more cars, eat more meat. Yet the result so far is that the world has largely been able to cope. The Green Revolution of the 1960s grew food faster than population increases, and as the world has been growing richer through globalized free trade, fertility has declined. In most European countries the birth rate is below the replacement rate, and it is estimated that by 2030 this will apply to two-thirds of the world's countries.

The transfer of advanced-country technology to poorer ones means that fewer children die in infancy, and mothers respond by having fewer children. As countries grow richer through industrialization and globalization, people no longer need the economic contribution of children to the family budget, and can afford to put them into education instead of work. As countries can afford social services such as pensions, parents no longer depend on their children to support them in old age. Education, particularly of women, leads to declining fertility rates.

Population growth is slowing. Although alarmist projections speak of 20 billion by 2050, and even 50 billion by 2100, the UN population forecast of 2017 suggested that the world's population is flattening, like a Kuznets Curve, as we near the "end of high fertility." Population studies suggest the world's numbers might stabilize at around 10 billion, before declining from that figure.

The question is whether the world can support that number, and the answer seems to be yes. Despite the increases in population, the number of people living in extreme poverty globally shows a steady decline. The global number of famine-related deaths has declined, and food supply per person has increased even as population has grown. New technologies are emerging to enable those numbers to be fed with less land, and to purify seawater to drinking quality in large quantities. Renewable energy sources will cut the emissions that people produce when they work and travel, and the products people use are being made with fewer natural resources.

It is by no means a Pollyanna picture, and could be upset, for example, by medical breakthroughs that greatly increase lifespan, or by political conflicts that thwart the spread of new technologies. But the balance so far is on the side of Julian Simon, who regarded humans as 'The Ultimate Resource,' capable of solving their problems, and it is against the doomsayers who treat human beings as if they were a form of pollution.

Tim Worstall

The State is not your friend

A little puzzlement here:

I had a visit from my benefits assessor – and now I fear the state more than poverty

Rob Palk

We’ll not try to vouch for the story, but it’s one of the State being incompetent at determining who is eligible, and who should be eligible, for that help from the rest of us that it is said to be the state’s responsibility to dispense.

Well, OK.

No, not that it’s OK. But the statement, OK, and?

For those who rail against that incompetence at disability assessment are those who also insist the state must do more. The very people who tell us that the welfare part of the state is somewhere between bad and incompetent at taking care of the poor and needy are the very same people who tell us that the government should also be taking on the entire transport system. Owning and running the electricity and other utility systems. Managing the economy, deciding what should be invested in and to what amount. Deciding what the appropriate car technology is 11 years ahead.

That’s what puzzles us. The shouting is that the state is incompetent at giving away free money. Why therefore the insistence that they should be doing more of more complicated things?

All of us having had real lives, we tend to think that those who can’t do things should be asked to do less.

Madsen Pirie

The Lords became a halfway house

On October 26th, 1999, the UK’s House of Lords voted to reform itself by abolishing the right of most of its hereditary peers to sit and vote. It did not completely abolish the hereditary principle, because out of about 750 hereditary peers, 92 were allowed to continue to sit and vote in the upper chamber. They were elected by their party colleagues in the Lords, with their numbers allocated according to party strength at the time. When one of the 92 dies or retires, their place is filled by having their colleagues choose a successor from among the non-voting hereditary peers.

The life peers, appointed by the Prime Minister, but with quotas agreed in practice for the other parties in proportion to their strength in the Commons, continue to sit. There are thought to be too many members to constitute a viable assembly: when the life peers are added to the hereditary peers, there are over 800 of them.

The 1999 Act was a patchwork-quilt compromise put together by Tony Blair and Viscount Cranborne, then the Conservative leader in the Lords. It was a deal they thought would get through Parliament successfully, and they were correct. But it left the upper chamber as a weird mix, a halfway house towards reform, and it was expected that a more thoroughgoing reform would be carried later to simplify it. Now, 20 years later, that reform has not materialized.

One problem is that the Commons, the popular chamber, is reluctant to see an upper house with an elected mandate that could challenge its authority. The Lords functions mostly as a revising chamber, and the Commons are jealous of their superior authority and status. Critics also point out that an elected upper chamber would return members according to party allegiances, and the Lords would lose its members currently appointed in recognition of their specialist talents. Furthermore, if the upper house were elected at the same time as the Commons, it would tend to feature similar party allegiances and be a virtual rubber stamp.

Political leaders in the Commons are said to be reluctant to give up the patronage they currently enjoy by being able to offer life peerages as a reward for loyalty or financial support. Members of Parliament might be less ready to quit the Commons without the prospect of a continuing political career on the red benches of the upper house.

Some analysts have proposed a mixture of appointed peers and elected ones, but critics of the idea point out that a Prime Minister could appoint life peers to outnumber the elected ones and thus diminish their influence.

The UK has been left with a strangely-composed upper house that is difficult to justify in theory, but which has worked reasonably well in practice for the past two decades. Obviously, it will be reformed at some stage in the future, but the complexities are such that no government has yet had the appetite to take on that task.
