Madsen Pirie

When Faraday broke the mould

Michael Faraday recorded in his laboratory diary on October 28th, 1831, that he was "making many experiments with the great magnet of the Royal Society." He demonstrated the link between magnetism and electricity and went on to invent the dynamo, the precursor of modern electricity generators, as well as laying the groundwork for the understanding of electromagnetic radiation.

The young Michael Faraday, born in Southwark in 1791, was the third of four children of a blacksmith who had recently moved from Westmorland. With only the most basic school education, Faraday had to educate himself, and he devoured scientific books.

When he was 20, Faraday attended lectures by the renowned chemist Humphry Davy of the Royal Institution and the Royal Society. When he sent Davy a 300-page bound copy of the notes he had taken, Davy employed him as his personal assistant, and as chemical assistant at the Royal Institution.

Faraday made a major discovery when he wrapped two insulated coils of wire around an iron ring, and found that upon passing a current through one coil, a momentary current was induced in the other coil. This phenomenon is now known as mutual induction. He deduced that there was only one type of electricity, contrary to the prevailing view that there were several, and he showed that changing the values of the quantity and the intensity – today's current and voltage – would produce different effects.

Faraday's work changed the world, and lies behind today's power generation, and his discoveries and developments underpin every electric motor in use today. He was remarkably modest. He refused a knighthood, having a religious objection to earthly rewards, and declined the offer of burial in Westminster Abbey, though there is a plaque honouring him near Newton's memorial.

Faraday undertook a series of Christmas lectures for young people at the Royal Institution, a series that continues today. Their aim was to present science to the general public in the hope of inspiring them. His lectures were described as joyful and juvenile; he would fill soap bubbles with various gases (to determine whether or not they were magnetic) in front of his audiences, and demonstrated the rich colours of polarized light.

Albert Einstein kept a picture of Michael Faraday in his office, alongside those of Isaac Newton and James Clerk Maxwell. There is a statue of Faraday in Savoy Place, London, outside the Institution of Engineering and Technology, and innumerable laboratories and university buildings are named after him.

Faraday showed that someone from a humble background, but someone with a thirst for discovery and an ambition to find out how the universe works, could write important pages in the catalogue of knowledge. He never patented any of his discoveries, leaving them open source for humankind, and he never tried to make money from them. He was quoted as saying: “I can at any moment convert my time into money, but I do not require more of the latter than is sufficient for necessary purposes.”

Humankind could use more Faradays to improve its condition and enlarge its choices, and there ought to be ways in which we can spot potential ones early and ensure that they are not held back by pressures to conform and to fall in with consensus. Faraday was a mould breaker, and we should strive for an educational system that encourages young minds to emulate him.

Tim Worstall

Micro-houses aren't the solution to homelessness, no

The Observer provides a fine example of Betteridge’s Law for us:

Are micro-houses the solution to Britain’s homelessness crisis?

The answer is, of course, no:

With a single bed, a chemical toilet and a phone charger in a very tight space, a “micro sleeping pod” is very much basic living. But for those that live in the tight shelters, set up by a charity, they provide a link between living on the streets and finding more long-term accommodation. And then there is the cost – at £10,500 for the pair, the pods in Newport, Wales are significantly cheaper than building a flat, according to Amazing Grace Spaces.

From Bristol to London, architects, planners and charities are developing unique styles of accommodation to cope with the housing crisis. With private rents increasing and the local housing allowance frozen until at least 2020, homelessness continues to spiral – last year Shelter said at least 320,000 people were homeless in Britain, up 4% on the previous year.

The reason is our definition of homelessness. Rough sleeping concerns somewhere between 5,000 and 10,000 people - the lower figure being roughly the number on any one night, the higher the number passing through that status over the course of a year.

That rough sleeping number contains two classes. There are those who pass through the status - runaways, those evicted, and so on - and gain housing rather quickly. Certainly, we would prefer that none pass through this status, but no system is going to be perfect. The truth is that those whose only problem is having nowhere to stay do get found somewhere to stay rapidly.

The second class - perhaps the habitual - nearly all suffer from one or more significant problems of addiction or mental health. Their problem isn’t a shortage of housing; it is being able to stay, being competent to stay, in housing once it is made available. More housing simply isn’t a solution to this set of problems. Reversing Care in the Community might be.

We might also observe in passing that when for-profit economic actors convert old office blocks into rather more spacious - and connected to the mains sewerage - living spaces, they are decried as spiv rentiers rather than brave charitable fighters against the evils of our age. But then, you know, propaganda.

But here is the real reason the headline confirms Betteridge’s Law. Shelter will still define — they have to, it’s how they reach that 320,000 number — people living in micro-houses as homeless. Therefore micro-houses cannot be the solution to homelessness by the definition being used, can they?

Madsen Pirie

China passes a billion

On October 27th, 1982, the People's Republic of China announced that its population had passed one billion - roughly the figure estimated, at the start of the 19th Century, to be the population of the whole world. The world's population has risen steadily since the Industrial Revolution and the development of modern medicine and sanitation enabled by the wealth it generated. It passed 3 billion in 1959, and since the 1970s it has risen by a billion roughly every 12 years. It passed 6 billion in 1999, then 7 billion in 2011, and currently stands at 7.742 billion.

Some commentators doubt the planet can support such numbers. Thomas Malthus predicted starvation because a finite amount of land could not support a population that could grow without limit. Paul and Anne Ehrlich, latter-day Malthusians, published "The Population Bomb" in 1968, predicting widespread famine in the 1970s and 80s. There have been many calls for the world to limit its population growth, and China responded with its 'one child' policy in 1979, a policy that lasted, with modifications, until 2015. The policy caused many social problems and was widely unpopular. It led to sex-selective abortions and even infanticide, given the Chinese cultural preference for at least one male child. Sir David Attenborough is among the voices today calling stridently for limits to be imposed.

Obviously, an increased population will need more food and more drinkable water. It will consume more resources, need more houses, light more fires, drive more cars, eat more meat. Yet the result so far is that the world has largely been able to cope. The Green Revolution of the 1960s increased food production faster than the population grew, and as the world has grown richer through globalized free trade, fertility has declined. In most European countries the birth rate is below the replacement rate, and it is estimated that by 2030 this will apply to two-thirds of the world's countries.

The transfer of advanced-country technology to poorer ones means that fewer children die in infancy, and mothers respond by having fewer children. As countries grow richer through industrialization and globalization, people no longer need the economic contribution of children to the family budget, and can afford to put them into education instead of work. As countries can afford social services such as pensions, parents no longer depend on their children to support them in old age. Education, particularly of women, leads to declining fertility rates.

Population growth is slowing. Although alarmist projections speak of 20 billion by 2050, and even 50 billion by 2100, the UN population forecast of 2017 suggested that the world's population is flattening, like a Kuznets Curve, as we near the "end of high fertility." Population studies suggest the world's numbers might stabilize at around 10 billion before declining from that figure.

The question is whether the world can support that number, and the answer seems to be yes. Despite the increases in population, the number of people living in extreme poverty globally shows a steady decline. The global number of famine-related deaths has declined, and food supply per person has increased even as the population has grown. New technologies are emerging to enable those numbers to be fed with less land, and to purify seawater to drinking quality in large quantities. Renewable energy sources will cut the emissions that people produce when they work and travel, and the products people use are being made with fewer natural resources.

It is by no means a Pollyanna picture, and could be upset, for example, by medical breakthroughs that greatly increase lifespan, or by political conflicts that thwart the spread of new technologies. But the balance so far is on the side of Julian Simon, who regarded humans as 'The Ultimate Resource,' capable of solving their problems, and it is against the doomsayers who treat human beings as if they were a form of pollution.

Tim Worstall

The State is not your friend

A little puzzlement here:

I had a visit from my benefits assessor – and now I fear the state more than poverty

Rob Palk

We’ll not try to vouch for the story, but it is one of the State being incompetent at determining who is eligible - who should be eligible - for that help from the rest of us which it is said to be the state’s responsibility to dispense.

Well, OK.

No, not that it’s OK. But the statement: OK, and?

For those who rail against that incompetence at disability assessment are those who also insist the state must do more. The very people who tell us that the welfare part of the state is somewhere between bad and incompetent at taking care of the poor and needy are the very same people who tell us that the government should also be taking on the entire transport system. Owning and running the electricity and other utility systems. Managing the economy, deciding what should be invested in and to what amount. Deciding what the appropriate car technology is 11 years ahead.

That’s what puzzles us. The shouting is that the state is incompetent at giving away free money. Why, therefore, the insistence that it should be doing more of more complicated things?

All of us having had real lives, we tend to think that those who can’t do things should be asked to do less.

Madsen Pirie

The Lords became a halfway house

On October 26th, 1999, the UK’s House of Lords voted to reform itself by abolishing the right of most of its hereditary peers to sit and vote. It did not completely abolish the hereditary principle, because out of about 750 hereditary peers, 92 were allowed to continue to sit and vote in the upper chamber. They were elected by their party colleagues in the Lords, with their numbers allocated according to party strength at the time. When one of the 92 dies or retires, their place is filled by having their colleagues choose a successor from among the non-voting hereditary peers.

The life peers, appointed by the Prime Minister, but with quotas agreed in practice for other parties, and proportional to their party’s strength in the Commons, continue to sit. There are thought to be too many of them to constitute a viable assembly since, when added to the hereditary peers, there are over 800 of them.

The 1999 Act was a patchwork-quilt compromise put together by Tony Blair and Viscount Cranborne, then the Conservative leader in the Lords. It was a deal they thought would get through Parliament successfully, and they were correct. But it left the upper chamber as a weird mix, a halfway house towards reform, and it was expected that a more thoroughgoing reform would be carried out later to simplify it. Now, 20 years later, that reform has not materialized.

One problem is that the Commons, the popular chamber, is reluctant to see an upper house with an elected mandate that could challenge its authority. The Lords functions mostly as a revising chamber, and the Commons are jealous of their superior authority and status. Critics also point out that an elected upper chamber would return members according to party allegiances, and the Lords would lose its members currently appointed in recognition of their specialist talents. Furthermore, if the upper house were elected at the same time as the Commons, it would tend to feature similar party allegiances and be a virtual rubber stamp.

Political leaders in the Commons are said to be reluctant to give up the patronage they currently enjoy by being able to offer life peerages as a reward for loyalty or financial support. Members of Parliament might be less ready to quit the Commons without the prospect of a continuing political career on the red benches of the upper house.

Some analysts have proposed a mixture of appointed peers and elected ones, but critics of the idea point out that a Prime Minister could appoint life peers to outnumber the elected ones and thus diminish their influence.

The UK has been left with a strangely-composed upper house that is difficult to justify in theory, but which has worked reasonably well in practice for the past two decades. Obviously, it will be reformed at some stage in the future, but the complexities are such that no government has yet had the appetite to take on that task.

Tim Worstall

Shouldn't we welcome a decline in prosecutions?

This rather puzzles us:

The number of drivers fined for using mobile phones has fallen to a record low, amid fears there are not enough police patrols to catch offenders.

Some 38,600 fixed penalty notices were issued by police to offenders in 2018 compared with 53,000 during the previous 12 months, according to Home Office data.

This is the lowest amount since current records began in 2011.

Fewer people being fined could be because the same level of offences is happening yet fewer are being caught at it. Or, of course, it could be that the incidence of the offence is lower:

AA President Edmund King suggested more offenders would be deterred from using their phones and caught if there were more police patrols.

Well, yes, but we still need to know whether the decline is because of not catching or not doing.

The Home Office data also shows that the number of fines issued for not wearing a seat belt rose by 17 per cent last year, while there was a 5 per cent increase in speeding tickets.

Careless driving fines excluding mobile use were up 20 per cent.

We seem to be catching people committing other offences in greater numbers. It seems unlikely, therefore, that a lack of catching is what is going on.

Fines for using mobile phones at wheel at record low amid concerns there are not enough police to catch drivers

If the incidence of the offence is falling then what’s the complaint again?

Madsen Pirie

Missing the last bus

The last horse-drawn bus operated by the London General Omnibus Company in central London ran on October 25th, 1911, between London Bridge and Moorgate Street. In August 1914, the last horse-drawn bus that ran anywhere in London ceased operation, and many of the horses were subsequently used for war service.

It had been a distinguished story, with the first regular horse-bus service started by George Shillibeer in 1829. The first ones could take 16-18 passengers in a single-deck vehicle pulled by three horses. He ran four or five services a day between Paddington Green and the Bank, charging a fare of one shilling (5p).

The services proliferated, and by 1888/89 the two Underground companies, the Metropolitan and District lines, provided horse buses as feeder services for their stations, selling through tickets that covered the bus and the tube trip. They ran for about 15 hours a day, usually with passengers facing each other on long bench-like seats that ran parallel to the direction of travel. The driver sat on a forward-facing bench. Side windows enabled the lower deck passengers to see out, but the upper deck travellers were exposed to the elements.

They were mainly patronized by the middle classes, because from the 1870s the working classes used the horse tramways with their cheaper fares. Typically, the horse buses were drawn by two horses, had smaller front wheels than rear ones, and were boarded by steps at the back. A third horse would normally be added on hilly routes.

Those who like to hark back to the days before London became polluted with motor vehicle exhausts might reflect on the pollution caused by horses. In 1900 London had several thousand horse-drawn buses, each needing 12 horses a day. To these were added 11,000 horse-drawn hansom cabs, making a total of 50,000 horses.

Since each horse produced about 15-35 lb of manure a day, this meant that about 600 tons of manure was being deposited on London’s streets each day. It attracted flies that spread typhoid and other diseases. Small children would ply street corners and, for a small coin, would sweep the road ahead of gentlemen and ladies so that their clothes would not be soiled. Each horse also produced about 2 pints of urine a day, and when their working life was over - usually after about three years - their bodies had to be disposed of. Sometimes they were left to rot on the streets.
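That 600-ton figure roughly checks out, as a minimal back-of-the-envelope sketch shows; the 25 lb midpoint and the use of long tons (2,240 lb) below are our assumptions for illustration, not figures given in the contemporary accounts.

    # Back-of-the-envelope check of the ~600 tons/day manure estimate
    horses = 50_000                       # buses and cabs combined, as above
    lb_per_horse_per_day = 25             # assumed midpoint of the quoted 15-35 lb range
    lb_per_long_ton = 2_240               # assuming long (imperial) tons

    total_lb = horses * lb_per_horse_per_day       # 1,250,000 lb of manure per day
    total_tons = total_lb / lb_per_long_ton        # ~558 long tons per day
    print(f"{total_tons:.0f} long tons per day")   # in line with the quoted ~600 tons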

There was a Great Manure Crisis of 1894, when the scale of the problem drew concerned attention. The Times newspaper predicted that “In 50 years, every street in London will be buried under nine feet of manure.” But the solution was not far off. In September, 1901, the same paper announced the introduction of “a service of motor-cars…to carry passengers, at omnibus fares, between Piccadilly Circus and Putney.” The new motorized transport was cheaper and more efficient, since the engines only had to be fed when working. Within a few years it displaced the horse for passenger transport in the metropolis. With the horses went the excrement, the urine, and the carcasses.

Technology solved the problem of horse pollution without people needing to change their behaviour or limit their travel. It replaced horse pollution with exhaust-fume pollution, but that, too, is being solved by the technological innovation of the electric car. Had there been Extinction Rebellion protesters in 1900, they would doubtless have campaigned to ban horses and their manure from the streets, and to have people travel less. Such a campaign was not needed, because innovative modes of transport made it unnecessary. As always, humans are quite good at solving problems.

Tim Worstall

We're sorry but we don't actually believe this

A new explanation for what actually happened before and during the Industrial Revolution: earnings were rising all along, it being annual incomes that increased as a result of longer work years.

Economists Jane Humphries and Jacob Weisdorf have uncovered new evidence to show that modern economic growth started in the late 16th Century – 200 years earlier than previously thought. They also argue that Britain’s early economic growth was driven by having longer work years.

In research published in the October 2019 issue of The Economic Journal, the authors challenge the widely held view, based on wage rates paid to British day labourers, that western societies began to grow rich as late as the 19th Century.

They re-estimate the starting point of modern economic growth by collecting wage data for historical British workers who worked for an annual stipend rather than the daily wages used by previous researchers.

The paper itself is here.

The reason we don’t believe it? The division is into paid labour and leisure. This is the same mistake that modern commentators make about the average workweek today - we’re all still working 40-hour weeks, so what happened to Keynes’ promise of 15 hours?

The bit that’s missing is that we all now work very much fewer hours - a reduction sufficient, in fact, to meet Keynes’ prediction - in unpaid household labour.

We do not insist that the early modern explanation is the same but we’d want to see at least consideration of the point. Early medieval farming life was based upon a large measure of household production for consumption within the household. One way of looking at the post-Black Death wage economy - combined with enclosure and so on - is the replacement of that household labour with market labour.

Missing this gives odd results. This paper - and many before it - tries to say that the villein was working perhaps 150 days a year. Commentators like Greg Clark and Juliet Schor indicate that peasants had 70 holidays a year, confusing holiday with Holy Day. This misses the fact that said villeins did some market work and some household work. They might farm their own 30 acres (say) and also have to do work on the Lord’s land by way of rent payment. Counting only that work done for cash and rent misses all of that household production, both in terms of the work year and of consumption.

One obvious point is that animal-owning peasants who work only 150 days a year rapidly become non-animal-owning peasants. The initial claim about the work year doesn’t make logical sense, that is.

What does make sense is if we divide the year, correctly, into the current four divisions: personal time, household production, market labour, leisure. What this paper, like so many others, is measuring is the substitution of market labour time for that household production. Yes, this will lead to greater incomes, but as a derivative - from the greater efficiency of market over household production - not from an increase in labour hours themselves.

Another way to make much the same point. What was the first major advance of that Industrial Revolution? Spinning. Whose labour was replaced by the Jenny? The household labour of most of the women of the country. For that is where spinning was done; it was a major labour occupation, but it is never recorded as market labour when done inside the household.

Just as we must include both household and market labour when we measure working hours today, so too must we when looking at history. Without doing so, the past just doesn’t make sense.

The usual estimation is that the average working year rose to 3,000 hours or so when the machines arrived. It’s only possible to assume the rise if we ignore all the work the peasants had been doing in their own fields, their own houses. It being that household labour that declined - exactly the same as the story of this past century.

Tim Ambler

Can’t afford the opera? Your GP will prescribe a visit.

Matt Hancock took time out on Wednesday to announce a new National Academy.  It turns out that this has nothing to do with conventional academia.  “Social prescribing” is having the NHS pay for our leisure pursuits – as if it did not cost us enough already. Our Health Secretary was “setting out his ambition for every patient in the country to have access to social prescribing schemes on the NHS as readily as they do medical care.

Social prescribing involves helping patients to improve their health, wellbeing and social welfare by connecting them to community services. This can include activities such as art and singing classes.”

Social prescribing, in a small way, has been around since the 1990s. More than 100 schemes exist in the UK, 25% or so of them in London. The General Practice Forward View (2016) proposed that the NHS appoint a national champion for social prescribing and said (p.28): “we will also work with CCGs to ensure they institute plans to address patient flows in their area using tried and tested ideas such as access hubs, social prescribing and evidence based minor ailment schemes.”

The King's Fund is more sceptical: “robust and systematic evidence on the effectiveness of social prescribing is very limited. Many studies are small scale, do not have a control group, focus on progress rather than outcomes, or relate to individual interventions rather than the social prescribing model. Much of the evidence available is qualitative, and relies on self-reported outcomes. Researchers have also highlighted the challenges of measuring the outcomes of complex interventions, or making meaningful comparisons between very different schemes.”  A 38% reduction (in some areas) in the use of hospital A&E units has been claimed, for example, which makes one wonder why those people were attending A&E in the first place if all they wanted was a little socialising.

There has been no serious quantification of the costs versus the benefits, something one might have expected a cash-strapped NHS to undertake before rolling a National Academy out across the country.  Nor has there been any comparative study of which types of social prescribing are most effective for which conditions.  There is just a generic claim that social prescribing is good for patients.  Guinness is probably better.  How much faith would one put in a GP who said “medicine would help your condition but I have no idea which one”?

The announcement of the National Academy had no evidence to support it, and no expected financial benefits. “The indepedent [sic] academy will receive £5 million of government funding”, presumably per annum, but that ignores the boosting of social prescribing from 60% to 100% of the NHS. Inter alia, “The National Academy for Social Prescribing will work to increase awareness of the benefits of social prescribing by building and promoting the evidence base.” How one-sided is that?

If the objective of the new academy was to reduce the costs of the NHS, and/or increase the benefits at the same cost, we should take it seriously.  But it is not.  The objective is solely “to help more people benefit from arts, sport and leisure activities across the country.”  Matt Hancock is jumping on a fashionable bandwagon with no evidential support and throwing away taxpayer money in the process.

Madsen Pirie

To boldly go into an optimistic future

Gene Roddenberry, who left us on October 24th, 1991, had several successes as a freelance TV scriptwriter. He was involved in a number of hit series: Highway Patrol, which I used to watch as a teenager; Have Gun Will Travel, which I saw as a university student; and, of course, the series that will forever be associated with his name, Star Trek, together with its later successor, Star Trek: The Next Generation.

The show was not a brilliant success, and NBC planned to close it after its second season, but a determined campaign by devoted ‘trekkies’ led them to air a third season. It was hailed as TV’s first ‘adult’ (meaning non-childish) science fiction series, and the surprise was that its hero, William Shatner’s Captain Kirk, was overshadowed by Leonard Nimoy’s emotion-free Vulcan, Mr Spock. Indeed, Roddenberry wrote to SF author Isaac Asimov to seek advice on how to counter this. Asimov suggested having Kirk and Spock work together as a team "to get people to think of Kirk when they think of Spock."

The optimism of the Star Trek universe was part of its appeal. Humanity was headed out to the stars not to conquer and exploit, but to explore and to make friends. Its introduction became famous.

“Space: the final frontier. These are the voyages of the starship Enterprise. Its five-year mission: to explore strange new worlds. To seek out new life and new civilizations. To boldly go where no man has gone before!”

Fans forgave the polystyrene rocks and the obvious humans-in-costume who represented aliens, and loved Star Trek’s technology. They loved phasers (set to stun), tricorders and transporters. This was even more true of the sequel series, The Next Generation, which featured holodecks and replicators.

In Star Trek: The Next Generation, the replicators satisfied material needs, so the series could concentrate on the character development of the main players. Humankind had broken free of superstition (including, apparently, religion, which was remarkable for a show written principally for a US audience). Racism and nationalism had been superseded by an affinity with all life-forms. Conflicts, potential and actual, were resolved for the most part by peaceful diplomacy, though there was the occasional steel behind the apple pie - “Let’s speak to them in a language everybody understands. Arm photon torpedoes!”

The series pictured a better future that people yearned for, one in which people would no longer strive for material gain, but for honour, and one characterized by constant outward reaching to learn new things. The final frontier calls to mind Frederick Jackson Turner’s 1893 Frontier in American History thesis. In Star Trek it is the space frontier that shapes humanity’s values.

There have been big-budget motion pictures, spin-off series that still continue, and annual conventions at which trekkies pay homage to their heroes. Overwhelmingly, though, it is the optimism that lingers. The vision of a better, calmer, but still challenging future draws us today as it did then, inspiring in many people the idea that if we want badly enough to have it happen, we can make it happen, make it so.
