Archive for Methodologies
Oxford University scientists have developed a new approach that turns functional magnetic resonance imaging (fMRI) into a fully quantitative measure of how the brain is working.
While the diagnostic and operational potential of this kind of methodological and technological development is obvious, imagine the potential to mine growing data sets of brain processes to determine the outcomes of an enormous set of scenarios, given the #BigData that would exist on the likenesses and differences of human and other brains. We are truly embarking on a new era of understanding because of our compounding technological (methodological, software, and hardware) developments.
Measurement results from a single slice from a representative subject using the new fMRI technique (credit: D.P. Bulte et al./NeuroImage)
I got this tweet today as part of a larger conversation about how technological breakthroughs could help predict disruptive economic times. Over the past 10 days or so, the US and global financial markets have taken a deep plunge as a result of, well, according to the CIOs (chief investment officers) and politicians, we don’t know. The news industry pins the nearly unanimous decision to sell, sell, sell on the latest geopolitical interactions and/or finance-specific news.
We see headlines like “Downgrade Ignites a Global Selloff” at the Wall Street Journal, referring to Standard & Poor’s downgraded credit rating of US Treasuries, which, by the way, soared during the selloff of equities because of their relative strength.
More important than what to buy: none of the headlines, nor the vague analysis, captures the actual root cause of this regular, or rather irregular, economic downturn recurring over the past decade. The general idea that one should be able to buy low and sell high, which held true in the 20th century, no longer applies. The root of the problem is our use of technologies to error-proof repetitive problems in the modern working world. Further, we know that nearly all errors originate with humans. Thus, error-proofing can be synonymous with human-proofing. We usually think of technologies that replace human activity as devices or software…“the robots”, and those do exist, but they are less of a threat than the methodological technologies.
We rarely think of a routine as a technology, but it is one. Benchmarking is a technology. With all of the methodological expertise poured into corporations over the past 30 years, we’ve finally gotten somewhere: efficient. How many times have you heard that word at the office? Since the late 1960s and the creation of Poka-Yoke by Dr. Shigeo Shingo, through Lean Manufacturing and Six Sigma, and most recently the third version of the IT Infrastructure Library (ITILv3), we have been actively depleting the workforce to ensure our qualitative (effectiveness) and quantitative (efficiency) superiority over the competition.
It’s a difficult dialogue to have, because a valid argument is: what’s wrong with business being efficient? My answer would be: nothing at all. The problem comes into play when humankind has rendered its ability to distribute value obsolete. In the past we’ve distributed value through a currency of some sort, and that currency (in primitive times and in the modern day) is backed by more than gold or bonds; it is also backed by faith in a philosophical system in which a woman or man gets paid for an “honest day’s work”, quite the primitive slogan. In a knowledge economy where people aren’t performing back-breaking work at the volumes they used to, and the work of 10 knowledge workers of the 1980s can be performed by one project manager armed with 30 years of benchmarked data and software/hardware help, it’s difficult to spread the wealth the way we once did.
When the markets sell off equities into cash, they are saying that the economy is inflated and weak. There are no buyers for the products being produced, because there are no jobs. There are no jobs because of all the error-proofing that preceded them; and finally, it is exceedingly difficult to quantify what people’s knowledge, experience, and existence are worth in the old paradigm. While it feels better to point the finger at the CEOs and politicians today, and I’d likely get a finger or two back, the real problem is that we are trying to distribute the wealth that still exists using an antiquated model.
If one looks at the M1 and M2 numbers at the US Federal Reserve, one will notice that all of the money we need to fix or build anything still exists. The same is true across the globe. When the news says that the money supply is lower, what it actually means is that money distribution is lower, because the money supply, as the link shows, is rarely diminished. As an economy contracts, funds return to their originators. The wealthiest of our species cannot justify how to spread a trillion dollars around at the moment, because there are fewer and fewer tasks to which a wage and a human resource can be assigned. I’ve got a few solutions to recommend in my next book project, Integrationalism: Essays on ownership and distributing value in the 21st century.
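The supply-versus-distribution distinction drawn above is close to what economists call the velocity of money: the money stock can stay flat while the rate at which it changes hands falls. A minimal sketch using the classic equation of exchange, M × V = P × Q; all figures below are hypothetical and illustrative, not actual Federal Reserve data:

```python
def velocity(nominal_gdp: float, money_stock: float) -> float:
    """Velocity of money: nominal GDP divided by the money stock (e.g. M2).

    From the equation of exchange M * V = P * Q, where P * Q is nominal GDP.
    """
    return nominal_gdp / money_stock


# Hypothetical economy: the money stock is unchanged between two periods,
# but spending (nominal GDP) falls -- so velocity (distribution) falls
# even though supply does not.
m2 = 9.0e12            # money stock, held constant (hypothetical)
gdp_before = 15.0e12   # nominal GDP before the downturn (hypothetical)
gdp_after = 13.5e12    # nominal GDP after the downturn (hypothetical)

v_before = velocity(gdp_before, m2)
v_after = velocity(gdp_after, m2)

print(f"Velocity before: {v_before:.2f}, after: {v_after:.2f}")
```

The point of the sketch is only that a lower measured "money supply" headline can reflect a drop in V, not in M, which matches the claim that the money itself rarely disappears.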