Blogroll

I read blogs, as well as write one. The 'blogroll' on this site reproduces some posts from some of the people I enjoy reading.

Disclaimer: Reproducing an article here does not necessarily imply agreement or endorsement!

Beta: MySQL Governor updated

CloudLinux - Thu, 18/01/2018 - 13:13

A new updated MySQL Governor version 1.2-31 is available for download from our updates-testing repository.

Changelog:

governor-mysql 1.2-31

  • fixed dbtop refresh interval;
  • fixed parsing of my.cnf error.

To update run:

$ yum update governor-mysql --enablerepo=cloudlinux-updates-testing
$ service db_governor restart

To install run:

$ yum install governor-mysql --enablerepo=cloudlinux-updates-testing
$ /usr/share/lve/dbgovernor/mysqlgovernor.py --install
Categories: Technology

LVE Manager updated

CloudLinux - Thu, 18/01/2018 - 12:37

A new updated LVE Manager package is available for download from our production repository.

Changelog:

lvemanager-3.0-38

  • LVEMAN-1215: fixed syntax in /etc/cl.selector/php.conf for old installations of LVE Manager.

To update run:

yum update lvemanager
Categories: Technology

However improbable: The story of a processor bug

CloudFlare - Thu, 18/01/2018 - 12:06

Processor problems have been in the news lately, due to the Meltdown and Spectre vulnerabilities. But generally, engineers writing software assume that computer hardware operates in a reliable, well-understood fashion, and that any problems lie on the software side of the software-hardware divide. Modern processor chips routinely execute many billions of instructions in a second, so any erratic behaviour must be very hard to trigger, or it would quickly become obvious.

But sometimes that assumption of reliable processor hardware doesn’t hold. Last year at Cloudflare, we were affected by a bug in one of Intel’s processor models. Here’s the story of how we found we had a mysterious problem, and how we tracked down the cause.

[Image: Sherlock Holmes pipe and hat (CC-BY-SA-3.0 image by Alterego)]

Prologue

Back in February 2017, Cloudflare disclosed a security problem which became known as Cloudbleed. The bug behind that incident lay in some code that ran on our servers to parse HTML. In certain cases involving invalid HTML, the parser would read data from a region of memory beyond the end of the buffer being parsed. The adjacent memory might contain other customers’ data, which would then be returned in the HTTP response, and the result was Cloudbleed.

But that wasn’t the only consequence of the bug. Sometimes it could lead to an invalid memory read, causing the NGINX process to crash, and we had metrics showing these crashes in the weeks leading up to the discovery of Cloudbleed. So one of the measures we took to prevent such a problem happening again was to require that every crash be investigated in detail.

We acted very swiftly to address Cloudbleed, and so ended the crashes due to that bug, but that did not stop all crashes. We set to work investigating these other crashes.

Crash is not a technical term

But what exactly does “crash” mean in this context? When a processor detects an attempt to access invalid memory (more precisely, an address without a valid page in the page tables), it signals a page fault to the operating system’s kernel. In the case of Linux, these page faults result in the delivery of a SIGSEGV signal to the relevant process (the name SIGSEGV derives from the historical Unix term “segmentation violation”, also known as a segmentation fault or segfault). The default behaviour for SIGSEGV is to terminate the process. It’s this abrupt termination that was one symptom of the Cloudbleed bug.
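
For concreteness, here is a minimal C sketch (nothing to do with Cloudflare's actual code) of the kind of invalid read described above. Run it and the default SIGSEGV action terminates the process, printing "Segmentation fault (core dumped)" where core dumps are enabled:

#include <stdio.h>

int main(void) {
    char *p = NULL;          /* no valid page is mapped at address 0 */
    printf("%c\n", p[0]);    /* invalid read: page fault, then SIGSEGV */
    return 0;
}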

This possibility of invalid memory access and the resulting termination is mostly relevant to processes written in C or C++. Higher-level compiled languages, such as Go and JVM-based languages, use type systems which prevent the kind of low-level programming errors that can lead to accesses of invalid memory. Furthermore, such languages have sophisticated runtimes that take advantage of page faults for implementation tricks that make them more efficient (a process can install a signal handler for SIGSEGV so that it does not get terminated, and instead can recover from the situation). And for interpreted languages such as Python, the interpreter checks that conditions leading to invalid memory accesses cannot occur. So unhandled SIGSEGV signals tend to be restricted to programming in C and C++.
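
As a sketch of that runtime trick, assuming Linux and the POSIX sigaction() API: a process installs a SIGSEGV handler, and the handler, rather than the default termination, decides what happens next. A real runtime would inspect the fault and repair the mapping; this illustration just reports and exits.

#include <signal.h>
#include <string.h>
#include <unistd.h>

static void on_segv(int sig, siginfo_t *info, void *ctx) {
    (void)sig; (void)info; (void)ctx;
    /* Handlers must stick to async-signal-safe calls, so write(), not printf() */
    static const char msg[] = "caught SIGSEGV\n";
    write(STDERR_FILENO, msg, sizeof msg - 1);
    _exit(1);  /* a real runtime might fix the mapping and return instead */
}

int main(void) {
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_sigaction = on_segv;   /* extended handler that receives fault details */
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGSEGV, &sa, NULL);

    volatile char *p = NULL;
    (void)p[0];                  /* faults, but now runs on_segv above */
    return 0;
}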

SIGSEGV is not the only signal that indicates an error in a process and causes termination. We also saw process terminations due to SIGABRT and SIGILL, suggesting other kinds of bugs in our code.

If the only information we had about these terminated NGINX processes was the signal involved, investigating the causes would have been difficult. But there is another feature of Linux (and other Unix-derived operating systems) that provided a path forward: core dumps. A core dump is a file written by the operating system when a process is terminated abruptly. It records the full state of the process at the time it was terminated, allowing post-mortem debugging. The state recorded includes:

  • The processor register values for all threads in the process (the values of some program variables will be held in registers)
  • The contents of the process’ conventional memory regions (giving the values of other program variables and heap data)
  • Descriptions of regions of memory that are read-only mappings of files, such as executables and shared libraries
  • Information associated with the signal that caused termination, such as the address of an attempted memory access that led to a SIGSEGV

Because core dumps record all this state, their size depends upon the program involved, but they can be fairly large. Our NGINX core dumps are often several gigabytes.
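
Whether a core file is written at all, and how large it may grow, is governed by the process's RLIMIT_CORE resource limit (the shell equivalent is ulimit -c). A minimal sketch, assuming Linux, of a process raising the limit for itself:

#include <stdio.h>
#include <sys/resource.h>

int main(void) {
    /* Request unlimited core file size; fails if it exceeds the hard limit */
    struct rlimit rl = { RLIM_INFINITY, RLIM_INFINITY };
    if (setrlimit(RLIMIT_CORE, &rl) != 0)
        perror("setrlimit");
    /* ...from here on, a fatal signal can produce a full core dump... */
    return 0;
}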

Once a core dump has been recorded, it can be inspected using a debugging tool such as gdb. This allows the state from the core dump to be explored in terms of the original program source code, so that you can inquire about the program stack and contents of variables and the heap in a reasonably convenient manner.

A brief aside: Why are core dumps called core dumps? It’s a historical term that originated in the 1960s when the principal form of random access memory was magnetic core memory. At the time, the word core was used as a shorthand for memory, so “core dump” means a dump of the contents of memory.

[Image: magnetic core memory (CC BY-SA 3.0 image by Konstantin Lanzet)]

The game is afoot

As we examined the core dumps, we were able to track some of them back to more bugs in our code. None of them leaked data as Cloudbleed had, or had other security implications for our customers. Some might have allowed an attacker to try to impact our service, but the core dumps suggested that the bugs were being triggered under innocuous conditions rather than attacks. We didn’t have to fix many such bugs before the number of core dumps being produced had dropped significantly.

But there were still some core dumps being produced on our servers — about one a day across our whole fleet of servers. And finding the root cause of these remaining ones proved more difficult.

We gradually began to suspect that these residual core dumps were not due to bugs in our code. These suspicions arose because we found cases where the state recorded in the core dump did not seem to be possible based on the program code (and in examining these cases, we didn’t rely on the C code, but looked at the machine code produced by the compiler, in case we were dealing with compiler bugs). At first, as we discussed these core dumps among the engineers at Cloudflare, there was some healthy scepticism about the idea that the cause might lie outside of our code, and there was at least one joke about cosmic rays. But as we amassed more and more examples, it became clear that something unusual was going on. Finding yet another “mystery core dump”, as we had taken to calling them, became routine, although the details of these core dumps were diverse, and the code triggering them was spread throughout our code base. The common feature was their apparent impossibility.

There was no obvious pattern to the servers which produced these mystery core dumps. We were getting about one a day on average across our fleet of servers. So the sample size was not very big, but they seemed to be evenly spread across all our servers and datacenters, and no one server was struck twice. The probability that an individual server would get a mystery core dump seemed to be very low (about one per ten years of server uptime, assuming they were indeed equally likely for all our servers). But because of our large number of servers, we got a steady trickle.

In quest of a solution

The rate of mystery core dumps was low enough that it didn’t appreciably impact the service to our customers. But we were still committed to examining every core dump that occurred. Although we got better at recognizing these mystery core dumps, investigating and classifying them was a drain on engineering resources. We wanted to find the root cause and fix it. So we started to consider causes that seemed somewhat plausible:

We looked at hardware problems. Memory errors in particular are a real possibility. But our servers use ECC (Error-Correcting Code) memory which can detect, and in most cases correct, any memory errors that do occur. Furthermore, any memory errors should be recorded in the IPMI logs of the servers. We do see some memory errors on our server fleet, but they were not correlated with the core dumps.

If not memory errors, then could there be a problem with the processor hardware? We mostly use Intel Xeon processors, of various models. These have a good reputation for reliability, and while the rate of core dumps was low, it seemed like it might be too high to be attributed to processor errors. We searched for reports of similar issues, and asked on the grapevine, but didn’t hear about anything that seemed to match our issue.

While we were investigating, an issue with Intel Skylake processors came to light. But at that time we did not have Skylake-based servers in production, and furthermore that issue related to particular code patterns that were not a common feature of our mystery core dumps.

Maybe the core dumps were being incorrectly recorded by the Linux kernel, so that a mundane crash due to a bug in our code ended up looking mysterious? But we didn’t see any patterns in the core dumps that pointed to something like this. Also, upon an unhandled SIGSEGV, the kernel generates a log line with a small amount of information about the cause, like this:

segfault at ffffffff810c644a ip 00005600af22884a sp 00007ffd771b9550 error 15 in nginx-fl[5600aeed2000+e09000]

We checked these log lines against the core dumps, and they were always consistent.

The kernel has a role in controlling the processor’s Memory Management Unit to provide virtual memory to application programs. So kernel bugs in that area can lead to surprising results (and we have encountered such a bug at Cloudflare in a different context). But we examined the kernel code, and searched for reports of relevant bugs against Linux, without finding anything.

For several weeks, our efforts to find the cause were not fruitful. Due to the very low frequency of the mystery core dumps when considered on a per-server basis, we couldn’t follow the usual last-resort approach to problem solving - changing various possible causative factors in the hope that they make the problem more or less likely to occur. We needed another lead.

The solution

But eventually, we noticed something crucial that we had missed until that point: all of the mystery core dumps came from servers containing the Intel Xeon E5-2650 v4. This model belongs to the generation of Intel processors that had the codename “Broadwell”, and it’s the only model of that generation that we use in our edge servers, so we simply call these servers Broadwells. The Broadwells made up about a third of our fleet at that time, and they were in many of our datacenters. This explains why the pattern was not immediately obvious.

This insight immediately threw the focus of our investigation back onto the possibility of processor hardware issues. We downloaded Intel’s Specification Update for this model. In these Specification Update documents Intel discloses all the ways that its processors deviate from their published specifications, whether due to benign discrepancies or bugs in the hardware (Intel entertainingly calls these “errata”).

The Specification Update described 85 issues, most of which are obscure issues of interest mainly to the developers of the BIOS and operating systems. But one caught our eye: “BDF76 An Intel® Hyper-Threading Technology Enabled Processor May Exhibit Internal Parity Errors or Unpredictable System Behavior”. The symptoms described for this issue are very broad (“unpredictable system behavior may occur”), but what we were observing seemed to match the description of this issue better than any other.

Furthermore, the Specification Update stated that BDF76 was fixed in a microcode update. Microcode is firmware that controls the lowest-level operation of the processor, and can be updated by the BIOS (from the system vendor) or the OS. Microcode updates can change the behaviour of the processor to some extent (exactly how much is a closely-guarded secret of Intel, although the recent microcode updates to address the Spectre vulnerability give some idea of the impressive degree to which Intel can reconfigure the processor’s behaviour).

The most convenient way for us to apply the microcode update to our Broadwell servers at that time was via a BIOS update from the server vendor. But rolling out a BIOS update to so many servers in so many data centers takes some planning and time to conduct. Due to the low rate of mystery core dumps, we would not know if BDF76 was really the root cause of our problems until a significant fraction of our Broadwell servers had been updated. A couple of weeks of keen anticipation followed while we awaited the outcome.

To our great relief, once the update was completed, the mystery core dumps stopped. This chart shows the number of core dumps we were getting each day for the relevant months of 2017:

[Chart: core dumps per day across the fleet, 2017]

As you can see, after the microcode update there is a marked reduction in the rate of core dumps. But we still get some core dumps. These are not mysteries, but represent conventional issues in our software. We continue to investigate and fix them to ensure they don’t represent security issues in our service.

The conclusion

Eliminating the mystery core dumps has made it easier to focus on any remaining crashes that are due to our code. It removes the temptation to dismiss a core dump because its cause is obscure.

And for some of the core dumps that we see now, understanding the cause can be very challenging. They correspond to very unlikely conditions, and often involve a root cause that is distant from the immediate issue that triggered the core dump. For example, we see segfaults in LuaJIT (which we embed in NGINX via OpenResty) that are not due to problems in LuaJIT, but rather because LuaJIT is particularly susceptible to damage to its data structures by bugs in unrelated C code.

Excited by core dump detective work? Or building systems at a scale where once-in-a-decade problems can get triggered every day? Then join our team.

Categories: Technology

Reign of Appearances

Peter Leithart - Thu, 18/01/2018 - 12:00
Mention the “public,” and you’re liable to be greeted with lamentation and hand-wringing. Citizenship isn’t what it used to be. No one participates in public events any more. Once upon a time, we were active citizens. Now we bowl alone and we participate in public life only as passive spectators. Democracy is dying, if it’s […]
Categories: People I don't know

FinTech, Robo Advisers, and the Soul of Swiss Banking

Mises Institute - Thu, 18/01/2018 - 12:00
By: Marcia Christoff-Kurapovna

This year, the European Union will start to force continental banks to open their customer interface to third-party providers. However, Swiss banking doesn’t intend to follow suit. Such is the latest development in the fight over the soul of Swiss banking, one pitting the trend-mad robo-ification of (digital) banking against the steadfast anchor and sail of the traditional Swiss model.

To its central banking fans, the payment-services-directive ‘2’ (PSD2) of the EU is being seen as a landmark decision: banks in the EU will have to open their interfaces to third-party providers of services — and that includes Fintechs (which have indeed taken root in the Confoederatio Helvetica). PSD2 is quite the controversial ream of red tape because external service providers may in theory access information about bank clients — a No if there ever was one in Swiss-minded money-mindedness. Proponents claim that customers “will profit from the rule through the best available service” and, of course, “the most innovative products”. In Switzerland, Hypothekarbank Lenzburg and Postfinance proclaimed themselves favorable toward such “open banking” regulation. Most rivals, however, remained skeptical, calling it, by way of the Swiss Bankers’ Association, “an economic thought-experiment with substantial risk.”

To understand what is at stake in the Swiss tradition here, one must first understand the global industry trend towards the all-out digitalization of banking; how far it has come, and why the old school Swiss bankers remain cautious, if not outright put-off. 

Readers of this site are well aware that banks around the world are cutting jobs as the industry is undergoing a sea-change transformation by digital technology, and by the increasing application of artificial intelligence and robotics. Vikram Pandit, who ran Citigroup Inc., has predicted some thirty percent of banking jobs will disappear over the next five years. Fintech hubs — cities where startups, talent, and funding come together — are spreading globally alongside this ongoing disruption in the contemporary culture of financial services. Such hubs are competing to become autonomous fintech centers in their own right, and to be at the foundation of what will eventually constitute the financial services “ecosystem” of the future.

Thus, it is no exaggeration to state that society has entered what is perhaps the most significant era of change for financial services companies since the 1970s brought about index mutual funds, discount brokers, and ATMs. To be sure, few, if any, banks, wealth management firms, or any financial services organization will be immune from this kind of disruption. As one Swiss economics journal put it, the most contentious conflicts (and partnerships) will be between “startups that are completely reengineering decades-old practices, traditional power players who are furiously trying to adapt with their own innovations and total disruption of established technology & processes”. That is to say:

  • Traditional Retail Banks vs. Online-Only Banks: Traditional retail banks will always retain the cachet of security and stability. Online-only banks, however, are asserting themselves more aggressively in claiming to offer the same services with higher rates and lower fees.
  • Traditional Lenders vs. Peer-to-Peer Marketplaces: P2P lending marketplaces are growing faster than traditional lenders—only time will tell if the banks’ strategy of creating their own small loan networks will be successful.
  • Traditional Asset Managers vs. Robo-Advisors: Companies like Betterment feature robo-advisors that offer lower fees and lower minimums; meanwhile, the larger traditional asset managers are creating their own robo-products while providing the kind of personalized attention for which high net worth clients are willing to pay quite generously.
  • Traditional Wealth Management vs. Automated Advice: a plethora of new software platforms and apps feature digital options, including mobile telephone payment services, automated wealth management advice, price comparison apps, tailored social media groups and crowdfunding systems. On the other side, the exclusivity of one-on-one attention is forever and very possibly will take on even more cachet as the somewhat sterile egalitarianism of digital banking erodes the cultural hierarchy of status.
  • Traditional Clearing Systems vs. Blockchain. This latter can store and distribute crypto-currencies (such as Bitcoin) and digital contracts (such as land deeds) without the need for banks or formal clearing systems. Proponents of Blockchain maintain that it promises “to reduce fees, improve security and bypass the volatility of central bank controlled fiat currencies”. Major technology firms such as Google, Amazon and Alibaba are also joining this trend.

But are these developments really desirable or even practical in the long term? Switzerland is a key case study here. To begin with, yes, Switzerland in a general sense wants its share of the global Fintech pie, along with most other advanced economies. A number of players have set up in the so-called crypto-valley around Zug, the capital of the eponymous canton in central Switzerland. The Swiss Financial Market Supervisory Authority (FINMA) has sought to ease up regulations for these “small and innovative” players. In March, that organization eased rules on verifying new clients by allowing video and online identification. FINMA has also backed the idea of a special Fintech banking license.

However, the Swiss have been critical about the appeal, or lack thereof, of a banking industry that emphasizes convenience over caution, novelty over reputable experience, and market trend over cultural tradition. Then, too, there is simply the question of practical efficiency regarding these Fintech start-ups, given that they are still dependent upon established players for access. Thus, a major problem facing Fintechs is the development of a customer base that makes the business worthwhile. One company, Truewealth, an online wealth manager, had to agree to a deal with BLKB (Basellandschaftliche Kantonalbank, the cantonal bank of Basel-Landschaft) to access a satisfactory client base. At Descartes Finance, another Swiss robo-adviser, cooperating with long-established asset managers is part of the business model. Additiv, a Fintech developer, has had to bring on board a well-known Swiss investor, Herr Martin Ebner, to help finance its expansion.

Such caution is, indeed, the Swiss Way of Wealth and robo advisers in Switzerland will continue to face difficulties in acquiring assets, despite the low fees charged. Skepticism in the country about the reliability of the technology is still too strong and many potential investors have other priorities. 

As I have written here, what maintains a kind of “stealth” popularity in the country is that prized national product, the Private Banker (as prized as a beat-up alpine barn stocked with gold bars and lakes one can drink out of). Long the rock in the storm and refuge away from globaloney banking, the banquier privé is a special creature managed by partners with unlimited liability on their commercial and personal wealth for the bank’s obligations. For the Swiss, it expresses the idea of free enterprise, independence, personal service traditions, bean-counter competence and, most of all, long-term performance over short-term gains. It is a world away from the tempest of “news” headlines, and private bankers have pretty much remained faithful to the values that have always guided them: in the endlessly shifting world of finance, they still do set the benchmark. Christian Rahn, of Rahn & Bodmer, one such private banker, is a stalwart defender of this model of “integrity, stability and know-how” despite every possible pressure to go the full-blown robo-digital route.

"We have faced more problems in the past 263 years than what is going on now," said Herr Rahn of Rahn & Bodmer, a family bank founded in 1755, in an interview with the Swiss press at the height of the IRS investigations into Swiss accounts. "We will survive this well."



Categories: Current Affairs

Structure of Isaiah

Peter Leithart - Thu, 18/01/2018 - 11:00
The following summarizes the structural analysis of Isaiah found in David Dorsey’s Literary Structure of the Old Testament. Like most of the books of the Old Testament, Dorsey finds that Isaiah is organized in a sevenfold pattern: A. Condemnation, pleading, promise of future restoration, 1:1-12: B. Oracles to the nations, 13:1-26:21 C. Woes, 27:1-35:10 D. Historical […]
Categories: People I don't know

An open letter to Christians who are using porn

The Good Book Company - Thu, 18/01/2018 - 09:41

My dear brother or sister.

You are finding it a struggle to even start reading this—because you are already a ball of conflicting emotions. 

You oscillate between thinking it’s not a big deal and knowing that you feel guilty and ashamed—and that your habit is consuming your soul.

You are fearful of admitting it, or being found out. And terrified of the disgust and sense of betrayal your friends, your spouse, your family will experience. To own up will change the way others see you. You imagine your reputation at church will be in ruins.

You’ve tried to stop—perhaps many times—but late at night, you flick to those channels or pick up that book or tap in that web address and all your resolve has gone. You feel powerless to resist, and all alone. Fantasies come unbidden into your mind. The momentary thrills you receive are fleeting—the shame and sense of slavery are a constant backdrop to everyday life.

And you have tried to do the right things. You’ve prayed. Desperately. But nothing has changed. Or you have been too frightened to pray; fearing to come before God so dirty, broken and useless.

Sister. Brother. I want to tell you that, even if you feel there is none, there is hope. 

Our God is the God of hope. Jesus died for helpless slaves like you and me. And when you became a believer, you were born again into a living hope through the resurrection of Jesus Christ from the dead.

Remember the truth that, because of Christ, your heavenly Father is the lover of your soul, not the judge of your failure and weakness. 

Remember the promise of Jesus that he will not crush the weakest reed or put out a flickering candle, but says: "Come to me, all who labor and are heavy laden, and I will give you rest. Take my yoke upon you, and learn from me, for I am gentle and lowly in heart, and you will find rest for your souls". (Matthew 11 v 28-29)

Remember you were never called to walk with Christ alone. And you will need the help and support of your friends, family, yes, even your spouse, to get free from the grip of this silent sickness. 

Even if you feel there is none, there is hope. 

So can I urge you to do one small thing.

Tell someone.

Ask if you can talk in confidence to a trusted friend, or a leader in your church. And simply tell them that you have become trapped by the sin that is crouching at all of our doors. Ask for help to get your thinking straight about the goodness of our God-given sexuality and sexual desires. Ask for help to see how our fallen-ness has brought chaos and failure into our loves. Ask for help to see how all-encompassing the grace of God is towards you in Jesus. Ask for help to see how God’s Holy Spirit can change you deep down.

Ask your friend to preach the love and grace of God to your wounded helpless heart.

And ask for practical help to see how you can change and grow.

Dear brother and sister. Please do not suffer in silence, or give up hope. Look for hope in the right place: the love the Father has for you, and the fellowship of friends he has given you to walk this painful path in the world.

 

To find out more about escaping the grip of porn, get hold of a copy of Vaughan Roberts' new book, The Porn Problem

Categories: Christian Resources

The Habitable Homes Bill is really about social housing

Adam Smith Institute - Thu, 18/01/2018 - 07:01

We thought this was an interesting comment on a proposed bill:

This coming Friday, 19 January, a bill is to be debated in parliament that could hugely improve the lives of many people in England.

The Homes (Fitness for Human Habitation) Bill would give private and social tenants the ability to take landlords to court if their home is unsafe. Over a million homes are thought to pose a serious threat to the health or safety of the people living there. This classification, also known as a “category 1 hazard”, covers 795,000 private tenancies – one in six of the privately rented homes in the country.

Ooooh, private landlords! How terrible they are. Why not use the righteous anger of the people against them? 

But then we get:

Although there are fewer of them, social tenants with an unsafe home currently have even less recourse, particularly where the landlord and the council are one and the same, as became tragically apparent following the Grenfell Tower fire last June.

Ah. 

So, we've those dastards in the private sector, facing competition plus independent (of them at least) enforcement of standards. We've then got the governmental (and quasi-such) sector with a monopoly supplier, no independent enforcement and thus consumers have even fewer rights and methods of gaining them.

Thus we must change the law. 

OK, the basic idea seems fair enough to us. Yes, that governmental supply should indeed be subject to the same consumer protections as the market sector. In fact, shouldn't this be true of the entire economy?   

Categories: Current Affairs

Beta: CloudLinux 6 kernel updated

CloudLinux - Thu, 18/01/2018 - 05:19

New updated CloudLinux 6 kernel version 2.6.32-896.16.1.lve1.4.51 is available for download from our updates-testing repository.

Changelog since kernel-2.6.32-896.16.1.lve1.4.50:

  • CKSIX-153: improved fix for Spectre Variant 1 attack.

To install a new kernel, please run the following command:

yum clean all --enablerepo=cloudlinux-updates-testing && yum install kernel-2.6.32-896.16.1.lve1.4.51.el6 --enablerepo=cloudlinux-updates-testing
Categories: Technology

Onward gay-sex missionaries marching carefully around Islamic hotspots

Anglican Ink - Thu, 18/01/2018 - 02:07

Jules Gomes offers his view on the formation of the Jayne Ozanne Foundation

Until You Can See the Marble Pattern of the Counter Through It

Blog & Mablog - Thu, 18/01/2018 - 02:00

“I would like to borrow a metaphor from Warfield and apply it to the phrase ‘the lordship of Christ.’ In the hands of liberals, the lordship of Christ is like pie dough—the farther you spread it, the thinner it gets” (Empires of Dirt, p. 124).

The post Until You Can See the Marble Pattern of the Counter Through It appeared first on Blog & Mablog.

Categories: People I don't know

Mission with young adults not as difficult as you think, says new research

Anglican Ink - Thu, 18/01/2018 - 01:40

"mission with young adults whilst challenging is not as difficult as one might think"

German Catholic bishop backs gay blessings

Anglican Ink - Thu, 18/01/2018 - 01:32

The Vice President of the German Catholic Bishops' Conference has called for a dialogue on blessing same-sex unions

Welby's criticism of George Bell "irresponsible and dangerous" say historians

Anglican Ink - Thu, 18/01/2018 - 00:44

(The text of a letter delivered to Lambeth Palace on 17 Jan 2018 written by seven leading historians of the 20th Century on the archbishop's criticisms of George Bell.)

Filibuster in Cuba, Part 1

Mises Institute - Wed, 17/01/2018 - 20:00
By: Chris Calton
Season 2

When Cuban slave owners started to worry that Spain was going to emancipate their slaves, Narciso López thought that the time was ripe to start a revolution to overthrow Spanish rule. American expansionists hoping for the annexation of Cuba volunteered to help López, and many of these expansionists wanted to see Cuba turned into a new slave state.



Categories: Current Affairs

A Regulated Economy Leads to a Socialist Economy

Mises Institute - Wed, 17/01/2018 - 20:00
By: Ludwig von Mises

What is interventionism?

Interventionism means that the government does not restrict its activity to the preservation of order, or—as people used to say a hundred years ago—to “the production of security.” Interventionism means that the government wants to do more. It wants to interfere with market phenomena.

If one objects and says the government should not interfere with business, people very often answer: “But the government necessarily always interferes. If there are policemen on the street, the government interferes. It interferes with a robber looting a shop or it prevents a man from stealing a car.” But when dealing with interventionism and defining what is meant by interventionism, we are speaking about government interference with the market. (That the government and the police are expected to protect the citizens, which includes businessmen, and of course their employees, against attacks on the part of domestic or foreign gangsters, is in fact a normal, necessary expectation of any government. Such protection is not an intervention, for the government’s only legitimate function is, precisely, to produce security.)

What we have in mind when we talk about interventionism is the government’s desire to do more than prevent assaults and fraud. Interventionism means that the government not only fails to protect the smooth functioning of the market economy, but that it interferes with the various market phenomena; it interferes with prices, with wage rates, interest rates, and profits.

The government wants to interfere in order to force businessmen to conduct their affairs in a different way than they would have chosen if they had obeyed only the consumers. Thus, all the measures of interventionism by the government are directed toward restricting the supremacy of consumers. The government wants to arrogate to itself the power, or at least a part of the power, which, in the free market economy, is in the hands of the consumers.

Let us consider one example of interventionism, very popular in many countries and tried again and again by many governments, especially in times of inflation. I refer to price control.

Governments usually resort to price control when they have inflated the money supply and people have begun to complain about the resulting rise in prices. There are many famous historical examples of price control methods that failed, but I shall refer to only two of them because, in both these cases, the governments were really very energetic in enforcing or trying to enforce their price controls.

The first famous example is the case of the Roman Emperor Diocletian, very well-known as the last of those Roman emperors who persecuted the Christians. The Roman emperor in the second part of the third century had only one financial method, and this was currency debasement. In those primitive ages, before the invention of the printing press, even inflation was, let us say, primitive. It involved debasement of the coinage, especially the silver. The government mixed more and more copper into the silver until the color of the silver coins was changed and the weight was reduced considerably. The result of this coinage debasement and the associated increase in the quantity of money was an increase in prices, followed by an edict to control prices. And Roman emperors were not very mild when they enforced a law; they did not consider death too mild a punishment for a man who had asked for a higher price. They enforced price control, but they failed to maintain the society. The result was the disintegration of the Roman Empire and the system of the division of labor.

Then, 1500 years later, the same currency debasement took place during the French Revolution. But this time a different method was used. The technology for producing money was considerably improved. It was no longer necessary for the French to resort to debasement of the coinage: they had the printing press. And the printing press was very efficient. Again, the result was an unprecedented rise in prices. But in the French Revolution maximum prices were not enforced by the same method of capital punishment which the Emperor Diocletian had used. There had also been an improvement in the technique of killing citizens. You all remember the famous Doctor J. I. Guillotin (1738–1814), who advocated the use of the guillotine. Despite the guillotine the French also failed with their laws of maximum prices. When Robespierre himself was carted off to the guillotine the people shouted, “There goes the dirty Maximum.”

I wanted to mention this, because people often say: “What is needed in order to make price control effective and efficient is merely more brutality and more energy.” Now certainly, Diocletian was very brutal, and so was the French Revolution. Nevertheless, price control measures in both ages failed entirely.

Now let us analyze the reasons for this failure. The government hears people complain that the price of milk has gone up. And milk is certainly very important, especially for the rising generation, for children. Consequently, the government declares a maximum price for milk, a maximum price that is lower than the potential market price would be. Now the government says: “Certainly we have done everything needed in order to make it possible for poor parents to buy as much milk as they need to feed their children.”

But what happens? On the one hand, the lower price of milk increases the demand for milk; people who could not afford to buy milk at a higher price are now able to buy it at the lower price which the government has decreed. And on the other hand some of the producers, those producers of milk who are producing at the highest cost—that is, the marginal producers—are now suffering losses, because the price which the government has decreed is lower than their costs. This is the important point in the market economy. The private entrepreneur, the private producer, cannot take losses in the long run. And as he cannot take losses in milk, he restricts the production of milk for the market. He may sell some of his cows for the slaughter house, or instead of milk he may sell some products made out of milk, for instance sour cream, butter or cheese.

Thus the government’s interference with the price of milk will result in less milk than there was before, and at the same time there will be a greater demand. Some people who are prepared to pay the government-decreed price cannot buy it. Another result will be that anxious people will hurry to be first at the shops. They have to wait outside. The long lines of people waiting at shops always appear as a familiar phenomenon in a city in which the government has decreed maximum prices for commodities that the government considers as important. This has happened everywhere when the price of milk was controlled. This was always prognosticated by economists. Of course, only by sound economists, and their number is not very great.

But what is the result of the government’s price control? The government is disappointed. It wanted to increase the satisfaction of the milk drinkers. But actually it has dissatisfied them. Before the government interfered, milk was expensive, but people could buy it. Now there is only an insufficient quantity of milk available. Therefore, the total consumption of milk drops. The children are getting less milk, not more. The next measure to which the government now resorts, is rationing. But rationing only means that certain people are privileged and are getting milk while other people are not getting any at all. Who gets milk and who does not, of course, is always very arbitrarily determined. One order may determine, for example, that children under four years old should get milk, and that children over four years, or between the age of four and six should get only half the ration which children under four years receive.

Whatever the government does, the fact remains, there is only a smaller amount of milk available. Thus people are still more dissatisfied than they were before. Now the government asks the milk producers (because the government does not have enough imagination to find out for itself): “Why do you not produce the same amount of milk you produced before?” The government gets the answer: “We cannot do it, since the costs of production are higher than the maximum price which the government has established.” Now the government studies the costs of the various items of production, and it discovers one of the items is fodder.

“Oh,” says the government, “the same control we applied to milk we will now apply to fodder. We will determine a maximum price for fodder, and then you will be able to feed your cows at a lower price, at a lower expenditure. Then everything will be all right; you will be able to produce more milk and you will sell more milk.”

But what happens now? The same story repeats itself with fodder, and as you can understand, for the same reasons. The production of fodder drops and the government is again faced with a dilemma. So the government arranges new hearings, to find out what is wrong with fodder production. And it gets an explanation from the producers of fodder precisely like the one it got from the milk producers. So the government must go a step farther, since it does not want to abandon the principle of price control. It determines maximum prices for producers’ goods which are necessary for the production of fodder. And the same story happens again.

The government at the same time starts controlling not only milk, but also eggs, meat, and other necessities. And every time the government gets the same result, everywhere the consequence is the same. Once the government fixes a maximum price for consumer goods, it has to go farther back to producers’ goods, and limit the prices of the producers’ goods required for the production of the price-controlled consumer goods. And so the government, having started with only a few price controls, goes farther and farther back in the process of production, fixing maximum prices for all kinds of producers’ goods, including of course the price of labor, because without wage control, the government’s “cost control” would be meaningless.

Moreover, the government cannot limit its interference into the market to only those things which it views as vital necessities, like milk, butter, eggs, and meat. It must necessarily include luxury goods, because if it did not limit their prices, capital and labor would abandon the production of vital necessities and would turn to producing those things which the government considers unnecessary luxury goods. Thus, the isolated interference with one or a few prices of consumer goods always brings about effects—and this is important to realize—which are even less satisfactory than the conditions that prevailed before.

Before the government interfered, milk and eggs were expensive; after the government interfered they began to disappear from the market. The government considered those items to be so important that it interfered; it wanted to increase the quantity and improve the supply. The result was the opposite: the isolated interference brought about a condition which—from the point of view of the government—is even more undesirable than the previous state of affairs which the government wanted to alter. And as the government goes farther and farther, it will finally arrive at a point where all prices, all wage rates, all interest rates, in short everything in the whole economic system, is determined by the government. And this, clearly, is socialism.

What I have told you here, this schematic and theoretical explanation, is precisely what happened in those countries which tried to enforce a maximum price control, where governments were stubborn enough to go step by step until they came to the end. 

Excerpted from Economic Policy

Categories: Current Affairs

Bible - Critical - Multiple Vulnerabilities - SA-CONTRIB-2018-003

Drupal Contrib Security - Wed, 17/01/2018 - 18:46
Project: Bible
Date: 2018-January-17
Security risk: Critical 17/25 AC:Basic/A:User/CI:Some/II:All/E:Proof/TD:All
Vulnerability: Multiple Vulnerabilities

Description:

This module enables you to display a Bible on your website. Users can associate notes with a Bible version.

This module has a vulnerability that would allow an attacker to wipe out, update or read notes from other users with a carefully crafted title.

A user must have the "Access Bible content" privilege, which is most likely the default if you have enabled this module.

The code appeared to allow other SQL injection vulnerabilities as well. Many lines of code were rewritten to make this module more secure. Therefore, even if you did not give users the "Access Bible content" privilege, there may have been other SQL vulnerabilities which could have been exploited.
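
The general defence against this class of bug is to bind user input as query parameters instead of splicing it into the SQL text (Drupal 7's database layer provides placeholders for exactly this purpose). A minimal sketch of the idea, written against SQLite's C API rather than the module's actual PHP, with a hypothetical notes table:

#include <sqlite3.h>

int delete_note(sqlite3 *db, int uid, const char *title) {
    /* BAD: building "DELETE ... WHERE title = '<title>'" by string
     * concatenation lets a crafted title rewrite the query. Instead,
     * bind the title as a parameter so it can only ever be data: */
    const char *sql = "DELETE FROM notes WHERE uid = ?1 AND title = ?2";
    sqlite3_stmt *stmt = NULL;
    int rc = sqlite3_prepare_v2(db, sql, -1, &stmt, NULL);
    if (rc != SQLITE_OK) return rc;
    sqlite3_bind_int(stmt, 1, uid);        /* scope the delete to one user */
    sqlite3_bind_text(stmt, 2, title, -1, SQLITE_TRANSIENT);
    rc = sqlite3_step(stmt);               /* runs the delete */
    sqlite3_finalize(stmt);
    return rc == SQLITE_DONE ? SQLITE_OK : rc;
}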

Solution: 

Install the latest version:

  • If you use the Bible module for Drupal 7.x, upgrade to Bible 7.x-1.7
Categories: Technology

Why California Has the Nation's Worst Poverty Rate

Mises Institute - Wed, 17/01/2018 - 17:30
By: Ryan McMaken

Earlier this week, the LA Times reminded its readers that California has the highest poverty rate in the nation. 

Specifically, when using the Census Bureau's most recent "Supplemental Poverty Measure" (SPM), California clocks in with a poverty rate of 20 percent, which places it as worst in the nation.

To be sure, California is running quite closely with Florida and Louisiana, but we can certainly say that California is a top contender when it comes to poverty.

This continues to be something of a black eye for California politicians who imagine themselves to be the enlightened elite of North America. The fact that one in five Californians is below this poverty line doesn't exactly lend itself to crowing about the state's success in its various wars on poverty. 

Many conservative sites have seized on the information to say "I told you so" and claim this shows that "blue-state" policies fail. One should be careful with this, of course, since there are plenty of red states in the top ten as well. Moreover, some blue states, like Massachusetts, are doing moderately well by this measure.

In the realm of political punditry, though, it matters a great deal whether one is using the regular poverty measure, or the SPM. For one, in the regular poverty measure, California ranks better than Texas, and leftists love to use the standard poverty rate to talk about how truly awful Texas and other red states are. The Supplemental Poverty Measure allows Texans to talk about how awful California is. 

If we're going to use census data to guess the prevalence of low-income households, though, the SPM is greatly superior to the old poverty rate. There's a reason, after all, that the Census Bureau developed it, and the Bureau has long warned that poverty rates using the old measure don't make for good comparisons across state lines. 

The old poverty measure was a far cruder measure that did not take local costs into account, did not include poverty-assistance income, and basically ignored what can be immense differences in the cost of living in different locations. Many commentators love to note how the median household income in many red states is below the national average — but then conveniently ignore how low the cost of living is in those places. 

The SPM, on the other hand, takes into account the costs of "food, clothing, shelter, and utilities, and a small additional amount to allow for other needs." It includes government benefits, but also subtracts taxes. (A full explanation is here.) 
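
As a toy sketch of the difference (all numbers hypothetical, not Census figures): an SPM-style test nets benefits and taxes against income, then compares the result to a threshold reflecting local costs, so the same household can fall on different sides of the poverty line in different states.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical SPM-style check: net resources vs. a locally adjusted threshold */
static bool spm_poor(double income, double benefits, double taxes,
                     double local_threshold) {
    double resources = income + benefits - taxes;
    return resources < local_threshold;
}

int main(void) {
    /* Identical household, different local cost thresholds */
    printf("high-cost state: %s\n",
           spm_poor(30000, 2000, 4000, 32000) ? "poor" : "not poor");
    printf("low-cost state:  %s\n",
           spm_poor(30000, 2000, 4000, 24000) ? "poor" : "not poor");
    return 0;
}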

The end result shouldn't really be all that surprising: once we take into account the actual cost of living, including taxes, we find that poverty is actually quite high in California. 

How to Alleviate Poverty 

There are only two ways to reduce poverty and increase the standard of living:

  • Increase household income
  • Lower the cost of living

Poverty can be alleviated by simply increasing income. Or it can be done by simply reducing the cost of living. Ideally, both things happen at once, and fortunately, that's usually how it works. 

The greatest reductions in global poverty have come about due to the spread of capital and industrial production methods. This is because better and more widespread use of capital leads to two things:

1. It increases household income by increasing worker productivity. That is, each worker can produce more stuff of higher value. This means each worker can take home a higher income. 

2. When we produce more stuff more quickly, that stuff becomes more affordable. Thanks to labor-saving and more efficient machinery, for example, fewer people can make more cars more quickly. In turn, more people can afford more cars because cars are more plentiful, and less expensive. 

Over time, more people can buy more stuff at lower prices, thus increasing their standard of living. Even better, thanks to modern capital, those people can also produce more during the hours they work, making it possible to buy even more stuff. Both pieces work together to increase living standards. 

One of the biggest problems California is facing right now, though, is that government interventions in the marketplace are making it harder and harder to produce more stuff, thus driving up prices. 

The end result is a higher cost of living, and thus more poverty. Kerry Jackson at The LA Times notes: 

Further contributing to the poverty problem is California’s housing crisis. More than four in 10 households spent more than 30% of their income on housing in 2015. A shortage of available units has driven prices ever higher, far above income increases. And that shortage is a direct outgrowth of misguided policies.

“Counties and local governments have imposed restrictive land-use regulations that drove up the price of land and dwellings,” explains analyst Wendell Cox. “Middle-income households have been forced to accept lower standards of living while the less fortunate have been driven into poverty by the high cost of housing.” The California Environmental Quality Act, passed in 1971, is one example; it can add $1 million to the cost of completing a housing development, says Todd Williams, an Oakland attorney who chairs the Wendel Rosen Black & Dean land-use group. CEQA costs have been known to shut down entire homebuilding projects. CEQA reform would help increase housing supply, but there’s no real movement to change the law.

Extensive environmental regulations aimed at reducing carbon dioxide emissions make energy more expensive, also hurting the poor. By some estimates, California energy costs are as much as 50% higher than the national average. Jonathan A. Lesser of Continental Economics, author of a 2015 Manhattan Institute study, “Less Carbon, Higher Prices,” found that “in 2012, nearly 1 million California households faced … energy expenditures exceeding 10% of household income. In certain California counties, the rate of energy poverty was as high as 15% of all households.” A Pacific Research Institute study by Wayne Winegarden found that the rate could exceed 17% of median income in some areas.

It is by now common knowledge that California is notoriously bad in terms of the cost of housing.

Whenever a new "top ten" list of least-affordable housing markets is published, California cities dominate it. In this list, for example, San Francisco, Los Angeles, San Jose, and San Diego are all in the top ten.

Housing is perhaps the poster child for the impossibility of getting ahead in California. Much of this is due to locally-based NIMBYism, in which local governments actively intervene to reduce new housing construction for the sake of "preserving the character" of the neighborhoods. This is just another way of saying: "rich people like things the way they are, so you poor people can just get lost. We're not building any more housing."

These same rich people then later pat themselves on the back for voting Democratic and "doing something" about poverty. 

But it's not all just local regulations. As Jackson notes, environmental regulations are especially burdensome on businesses, thus driving up the cost of everything. This is especially true of housing, which requires land and water resources and visibly impacts the local environment.

These regulations, mind you, are all imposed on top of already-existing federal regulations, and go beyond the environmental rules that other states manage to enforce at a far lower burden to business. Coloradans, for example, aren't exactly living in rivers of toxic sludge, in spite of having fewer environmental regulations — and cheaper housing.

Nor is housing the only industry impacted by these regulations. Mountains of anti-business regulations in the state also make it harder to start new businesses, hire people, and cover the basic costs of expanding worker productivity. Fewer workers get hired. Less capital is deployed to workers. The end result is that worker productivity growth can't keep up with increases in the cost of living. Poverty results.

Faced with this vise in which the poor are caught in California, the response is always the same: more rent control, more regulations, and more costly hoops for employers to jump through.

"We're taming capitalism!" the politicians tell themselves. Unfortunately, they've driven a fifth of the population into poverty in the process. 

But don't expect things to improve for the poor in California any time soon. California is perhaps the single biggest example in the US of how stylish locales become playgrounds for the rich, and a treadmill to nowhere for everyone else. 

In recent years, news outlets have carried a number of articles on how workers in Silicon Valley are living in their cars. Some of the homeless even have jobs at big tech firms like Facebook; indeed, nearly all of them have jobs of some sort. Thanks to the ruling classes of California, though, a basic apartment is $3,000 per month, while food and gasoline aren't exactly cheap.

The well-to-do tell themselves that the high cost of living is simply "the cost of doing business" for living in such a wonderful place with so many enlightened, intelligent, and beautiful people. People can go to the beach whenever they want, and life is wonderful. 

Of course, anyone who has actually lived in California as a non-wealthy person knows that you most certainly can't go to the beach "whenever you want." If you're working two jobs to pay the rent, a day at the beach — after sitting in traffic and paying for parking — isn't exactly a regular event. Moreover, the communities with non-sky-high rents are generally found well inland, and aren't exactly next to Malibu.

This may help explain why, as the Sacramento Bee reported last year, California is exporting its poor to Texas. The beaches aren't as nice in Texas, but many of these migrants are trading in the beaches — which they never see anyway — for an affordable apartment. 



Categories: Current Affairs

A Royal Commission on the NHS

Adam Smith Institute - Wed, 17/01/2018 - 17:05

Lord Saatchi and Dominic Nutt last week published a remit for a Royal Commission on the NHS. The NHS staggers from crisis to crisis. The demand for a non-party-political strategic review, first proposed by Norman Lamb, then shadow Health Minister for the LibDems, is growing. About 100 MPs agree.  We supported the idea some months back. The government does not (yet) agree.

The Saatchi paper was cited by Dr Andrew Murrison who also pushed for a Royal Commission in a PMQ on 10th January.  The Prime Minister briefly dismissed the proposal with disdain:  any problems the NHS may have need immediate attention, rather than awaiting a Royal Commission.  The party line is that everything is fine because the Department of Health now has a Plan.  The PM’s response is flawed: a strategic review would not inhibit any improvements that can now be made.

The only previous NHS Royal Commission was set up by Harold Wilson in 1975.  Little has changed.  Waiting times at A&E were one concern; excessive bureaucracy and layers of management were others. Some of the report's conclusions are familiar: “The development of nursing homes could make a major contribution to the care of the elderly.” (22.32) Mental health needs to “be integrated fully into a unified psychiatric service, and to receive a proper share of capital monies.” (22.34) “Finally, we concluded that communications between the hospital and the community services were not all that they should be, and that the arrangements for community workers to work in hospitals, and hospital workers in the community needed to be improved. Strong links were particularly important in the rehabilitation services.” (22.35) Many of the 58 recommendations, such as compulsory seat belts, were successfully implemented but many others still, 39 years on, need to be.

That report focused on effectiveness, i.e. NHS value for money, but even so, it took four years to produce.  In contrast, the Saatchi remit covers almost everything conceivable: “The aim should be to produce a fully costed blueprint that delivers the best possible outcomes over the coming decades at the lowest cost.”  Actually: “a series of options for implementing its central ideas, each of them fully costed”. Identifying and measuring “the best possible outcomes” would be ambitious, never mind quantifying all the alternative cost implications.  Health inequalities between different parts of the country and of society would be corrected along the way. “The Royal Commission would also be tasked with investigating a range of other issues, including the gap in health outcomes between rich and poor, and between Britain and other countries; the ageing population; the pace and cost of medical innovation; the need to integrate social and long-term care with health care; the case for and against greater private sector involvement in the delivery of health care; the tensions between privacy and better use of health data; and potential additional sources of revenue for the NHS to complement general taxation.”  And “Mental health provision must also be considered, including as a chronic, public health issue which causes, and can be caused by, poverty.”

This brief, to cure the NHS, adult social care, mental health problems and poverty, would, if it were feasible at all, require many more years than the four of the 1979 report.  Can the NHS afford to wait that long?

Norman Lamb’s proposal that a non-party-political “convention” be tasked to report back within a year is realistic and practical.  Like the 1979 report, it should focus on value for money, i.e. minimising waste, and leave to the Chancellor the issue of the quantum, i.e. the total departmental budget the nation can afford.  This would make the convention far more acceptable to a government tired of being lectured about increasing NHS spending.

There is plenty to go for.  The 2016/7 Department of Health accounts (p.119) show that, of the Department of Health’s total £139bn expenditure, £106.9bn (77%) went to NHS England, £2bn of which was passed on to Local Authorities for social care (unblocking beds).  The vast majority of adult social care is funded by the Department for Communities and Local Government (as it was named until January 2018).  Only about £100bn (72%) reached the front line, i.e. treating medical and dentistry patients, medicines and devices.  5% of the departmental total (£7.2bn) went on quangos – some necessary, some not.

Comprehensive as the Saatchi remit is, a few areas seem to have been overlooked, such as:

Restraining demand through co-payments, such as those that already apply to dentistry and prescriptions, and redrawing the boundary of what the NHS should provide.

How to push treatment back from higher-cost provision (acute hospitals) to lower-cost provision (cottage hospitals or the home).

The impact on management hierarchies of closer integration of the (vertical) NHS and the (local/lateral) adult social services.  The latter is less top-heavy and may be the better model.  For example, the widely discredited CCGs could be eliminated, saving £1bn p.a. in managerial costs.

We are short of GPs, geriatricians and nurses essentially because people don’t want those careers.  Paying them more would help up to a point, but the problem has more to do with job satisfaction: less bureaucracy, interference and record keeping, and more trust, would all help.

A Royal Commission is attractive but, sadly, not feasible.  The Saatchi remit would take too long, cost too much and arrive on the desk of a new government which is unlikely to share the perspective of the current team.  We need decisions before the next election.  On the other hand, the Lamb proposal of a non-party-political, year-long convention is something the government should rush to accept.  If it does not, the House of Commons Health Committee should commission it, as it is perfectly entitled to do.

Categories: Current Affairs
