「華人戴明學院」 is a learning community for Deming's philosophy, devoted to the study, promotion, and application of the System of Profound Knowledge. The purpose of this blog is to advance the ideas and ideals of W. Edwards Deming.

Friday, August 12, 2016

Europe's biggest software maker SAP ditches annual reviews

Germany's SAP (SAPG.DE), maker of software used to grade the performance of millions of employees worldwide, is ditching its own annual performance reviews as too expensive, time-consuming and often demotivating.
WWW.REUTERS.COM | By Harro ten Wolde and Ilona Wissenbach

Thursday, August 11, 2016

The 'circle' in 『台灣戴明圈』 (A Taiwanese Deming Circle) has had a place in English for several hundred years

This blog calls itself 『台灣戴明圈』 (A Taiwanese Deming Circle).

The idea of the 'circle' comes from the quality circle,
yet the word itself has had a place in English for several hundred years.

The Circle of the National Gallery of Art
National Gallery of Art
Thomas Cole is generally considered America's first important landscape painter. He made his first trip to Europe as an artist in 1829. While in Italy in 1831-1832, Cole saw and sketched many scenes of circular, ruined towers set in lonely landscapes.
For "Italian Coast Scene with Ruined Tower," Cole set out to fulfill a commission for a scene from Byron's narrative poem, "The Corsair." Encountering difficulties with that subject, he shifted to a different source, Coleridge's introduction to "The Ballad of the Dark Ladie." Neither provided much inspiration; as Cole struggled to bring the painting to completion, he was beset by doubts and his mood became troubled. Cole chose to abandon his poetic sources and made the picture into something purely his own.
Thomas Cole, "Italian Coast Scene with Ruined Tower," 1838, oil on canvas, National Gallery of Art, Washington, Gift of The Circle of the National Gallery of Art





How 'circle' gained new senses over the centuries

Mid-17th century: a particular level of society

coterie 

Pronunciation: /ˈkəʊt(ə)ri/ 

NOUN (plural coteries)

A small group of people with shared interests or tastes, especially one that is exclusive of other people: a coterie of friends and advisers

Origin

Early 18th century: from French, earlier denoting an association of tenants, based on Middle Low German kote 'cote'.

Early 18th century: an association of like-minded people
A group of people with a shared profession, interests, or acquaintances: she did not normally move in such exalted circles
19th century: gatherings of spiritualists
séance
Pronunciation: /ˈseɪɒns/, /ˈseɪɒ̃s/, /ˈseɪɑːns/
NOUN
A meeting at which people attempt to make contact with the dead, especially through the agency of a medium.
Origin
Late 18th century: French séance, from Old French seoir, from Latin sedere 'sit'.

20th century: quality circles and the like



quality circle




NOUN

A group of employees who meet regularly to consider ways of resolving problems and improving production in their organization.

Wednesday, August 10, 2016

Low wages are both a cause and a consequence of low productivity. US economy: The productivity puzzle



American productivity is still in a rut, having fallen for three straight quarters. How can more be done with less? From the archive



Free exchange: doing less with more
ECONOMIST.COM

June 29, 2014 5:23 pm

US economy: The productivity puzzle

Long-term prosperity depends on the capacity of every American to increase output constantly. Can they?
Photo: combines harvesting rows of swathed wheat near Colgate, North Dakota © Alamy
Bumper crop: the success of US farmers in increasing their productivity is a model that some, such as MIT’s Erik Brynjolfsson, think can be replicated
To glimpse the miracle of productivity growth there is nowhere better to look than the bountiful fields of the US Corn Belt. A hundred years ago, an army of farmers toiled to produce 30 bushels an acre; now only a few hands are needed to produce 160 bushels from the same land.
The rise of modern civilisation rested on this trend: for each person to produce ever more. For the past 120 years, as if bound by some inexorable law, output per head of population increased by about 2 per cent a year. That is, until now.


There is a fear – voiced by credible economists such as Robert Gordon of Northwestern University – that 2 per cent is no law but a wave that has already run its course. According to Prof Gordon’s analysis, 2 per cent could easily become 1 per cent or even less, for the next 120 years.
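A back-of-the-envelope check, not from the article, shows why the gap between 2 per cent and 1 per cent matters so much over Prof Gordon's 120-year horizon (a quick Python sketch):

# Compound two hypothetical growth rates in output per head over 120 years.
for rate in (0.02, 0.01):
    factor = (1 + rate) ** 120
    print(f"{rate:.0%} a year for 120 years -> output per head grows {factor:.1f}-fold")
# 2% a year -> roughly 10.8-fold
# 1% a year -> roughly 3.3-fold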
The US Federal Reserve is already edging down its forecasts for long-term interest rates. “The most likely reason for that is there has been some slight decline . . .  of projections pertaining to longer-term growth,” said Janet Yellen, chairwoman, at her most recent press conference.
Yet there are also techno-optimists, such as Erik Brynjolfsson and Andrew McAfee of the Massachusetts Institute of Technology, whose faith in new discoveries is such that they expect growth to accelerate, not decline.
Then there are more phlegmatic economists, whose answers are less exciting but also less speculative – and come in a bit below 2 per cent for growth in output per head.
The productivity question is of the greatest possible consequence for the US economy, affecting everything from when interest rates should rise to where they should peak, from the sustainability of US debt to what is the wisest level of investment for every business in the country.
The answer depends on companies such as Climate Corporation, which fights the battle for agricultural productivity growth from its new front line, in the office buildings of Silicon Valley.
Climate Corporation, which was bought last year by Monsanto for $930m, works on “precision agriculture”, bringing the power of data science to bear on farming.
For example, the company says that by combining fertiliser use, soil type, weather data and other information in a single database, it can tell farmers exactly how much nitrogen is in a field and thus how much fertiliser they need to apply.
The boost to yields could be as much as 5 per cent – and that is just the start. “We’ve identified about 40 different decisions a farmer makes where there’s potential to apply data science,” says Anthony Osborne, the company’s head of marketing.
Whether computers can keep making broad contributions to productivity is one of the most important immediate issues.
But it is not the only issue. Growth in gross domestic product, the familiar statistic by which all economies are measured, can come about in several ways: more workers, with better skills; more capital such as factories, roads and machines, or new technology. Leaving aside the latter category, the consensus among economists is that most of these will not contribute as much to economic growth as they have in the past.
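The paragraph above is, in effect, the textbook growth-accounting decomposition; stated as a formula (a standard identity, not something the FT spells out):

\[
\frac{\dot{Y}}{Y} \;=\; \alpha\,\frac{\dot{K}}{K} \;+\; (1-\alpha)\,\frac{\dot{L}}{L} \;+\; \frac{\dot{A}}{A},
\]

where output growth is split between capital accumulation, growth of (quality-adjusted) labour, and total factor productivity, the residual that stands in for technology.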
To start with, US population growth is at its lowest since the 1930s, having fallen from about 1.2 per cent a year in the 1990s to 0.7 per cent in recent years. This does not affect growth in living standards – it means fewer consumers as well as fewer workers – but adding less extra labour will slow the headline GDP growth rate that the Fed worries about.
On top of that, demographics will also slow growth in GDP per capita, which does affect living standards. Ageing will mean fewer active workers per head of population; most women have now joined the US labour force so that source of extra workers is running out.
Prof Gordon estimates that demographics could knock 0.3 percentage points off the long-run trend of 2 per cent growth.
“Everybody is pretty much in agreement in expecting slower growth in hours worked relative to what we’ve seen in the last 50 years,” says John Fernald, a senior research adviser at the Federal Reserve Bank of San Francisco.
The truest measure of economic progress, though, is the growth of GDP per hour worked. For every hour of human toil, how much is created? Here too, some factors that drove growth in the past are weakening, such as skills and education.
The expansion of primary, then secondary and then college education has helped the economy grow for generations, but average years of education have now reached a plateau. “The US is slipping back in the league tables of college completion and high school completion,” says Prof Gordon, suggesting this will account for another 0.2 percentage points off per capita growth.
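Reading Prof Gordon's two estimates together (my arithmetic, assuming technology's contribution were unchanged):

\[
2.0\% \;-\; 0.3\ \text{pp (demographics)} \;-\; 0.2\ \text{pp (education plateau)} \;\approx\; 1.5\%\ \text{per-capita growth.}
\]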
That leaves technology. “I agree with much of what he says about the slowing demographics,” says Prof Brynjolfsson. “Where he and I differ is prospects for future innovation.”
Growth in GDP per hour worked depicts an interesting pattern over time. According to Prof Gordon, at a rate of 2.4 per cent, it was fast from the 19th century until 1972. It then slowed to 1.4 per cent a year until 1996.
The internet boom pushed the rate up to 2.6 per cent – it was this period that led Alan Greenspan, former Fed chairman, to talk about a “productivity feast” – but by 2004, well before the financial crisis, the surge was over. Since 2004, barring a measurement problem, growth in output per hour has been 1.3 per cent.
The dispute is this: in the coming decades, should we expect growth like that which we experienced from 1996 to 2004, at 2.5 per cent, or like the period since 2004, of 1.3 per cent? While Prof Brynjolfsson has Star Trek visions of utopian technological progress, Prof Gordon is more of a cyberpunk, imagining a world in which the computers may become more powerful but living standards for average humans improve only slowly.
Computation is the root of Prof Brynjolfsson’s optimism: his book with Prof McAfee is called The Second Machine Age and argues that the impact of information technology has only just begun to be realised. Exponential expansion in computing power, and the ability to diffuse innovations rapidly, could mean growth like that of the late 1990s.
“The reason I’m optimistic is that I don’t rely primarily on extrapolating past economic trends,” says Prof Brynjolfsson. After visiting labs, he says, “I just come away astonished at what’s in the pipeline. Most of it has not yet reached commercialisation.”
Rather than referring to historical data, he points to Google’s self-driving car, to the potential for computer systems that diagnose disease and answer legal queries, and the growing flexibility of robotics. Such automation will free up a host of labour for new tasks, just as other innovations did in the past. “Whether it’s robotics or software for knowledge work, if you take the labour input to zero you get a pretty astronomical productivity number,” he says.
By contrast, Prof Gordon expects a lower pace of productivity growth, perhaps in line with that achieved in the past 10 years. To hit even that target, he points out, means keeping up a steady stream of new creations such as smartphones.
The heart of his argument is that the discoveries of the past – running water, the internal combustion engine, the electric lightbulb – were simply more important than those of today. From 1870 to 1972, he points out, American homes went from lightless, isolated places of drudgery to buildings of air-conditioned comfort, with a dishwasher in the kitchen and a car in the garage.
Prof Gordon is also dismissive of the potential productivity gains from inventions such as driverless cars: being able to answer email instead of turning the steering-wheel, for example. “The real productivity gains would presumably come from driverless trucks,” he says, but then points out that a UPS delivery van would still need a driver to remove the parcels from the vehicle.
He is more impressed by the potential of robotics but less convinced the moment has arrived when they are sufficiently powerful to supplant humans. “Think of every employee you’ve had contact with in the last two or three days, and think, is that person going to be replaced by a robot in the next 20 years?”
One curious aspect of both professors’ arguments is how uneconomic they are. Their focus is more about what is left to discover than the economy’s ability to make those discoveries. Prof Gordon’s approach would struggle to explain the 1996 acceleration in productivity growth, while Prof Brynjolfsson’s has little to say about the slowdown after 2004.
Yet economics has quite a lot to say about the process of making discoveries, based on the less than revolutionary insight that breakthroughs depend on the effort put into researching them.
In a recent study, Mr Fernald and Charles Jones of Stanford University break down the inexorable 2 per cent growth in US output per person from 1950 to 2007 in a different way. They find almost none of it comes from more capital per worker.
About 0.4 percentage points comes from human capital (better education). But by far the largest contribution – 1.6 percentage points of the total – comes from the fact that more people are working on research and development.
In part, that is because there are more people (and thus more bodies to do research). But mainly it is the result of devoting a steadily larger portion of the total population to work on research and development.
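Summing the contributions Fernald and Jones report, as cited above (a minimal Python sketch; the labels paraphrase the article's wording):

# Decomposition of ~2% annual growth in US output per person, 1950-2007,
# in percentage points per year, as summarised in the article.
contributions = {
    "capital per worker": 0.0,           # "almost none"
    "human capital (education)": 0.4,
    "rising share of people in R&D": 1.6,
}
print(sum(contributions.values()))       # ~2.0 percentage points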
This analysis allows for a more grounded forecast than speculation about what technologies remain to invent. “The pessimistic part of that equation for the future is human capital,” notes Mr Fernald, as the contribution from better education is petering out. It is also impossible for the US to keep devoting ever more of its population to working in research and development.
But this is not true of the world as a whole. Huge populations in China, India and elsewhere are joining the global economy, improving their education systems and putting more researchers to work at the scientific frontier. Any discoveries they make can be used in the US as easily as anywhere else.

FT Video (June 2014): James Mackintosh on why there are reasons to be positive about weak US GDP figures, which, he says, reflect problems with snowfall rather than fundamentals.
In that case, the improvements that come from scientific discovery may be sustainable. Productivity growth need only slow to about 1.6 per cent. Add in some modest increase in population and the economy as a whole could expand at 2 per cent per year or a little more. Mr Fernald’s long-run forecast is 2.1 per cent. This suggests that the Fed open market committee’s latest projection of 2.2 per cent is not far off.
Climate Corporation shows how innovative the US still is – and how computers can yet boost productivity in unexpected ways. To sustain that magical run of 2 per cent growth in output per person, however, the US may need more Silicon Valleys to emerge in China and India, and add their heft to the eternal pursuit of another bushel of corn from the same acre of land.
Intel, clock speed and the measurement of productivity growth
Is the recent slowdown in productivity growth nothing but a statistical mirage? A recent study by economists David Byrne, Stephen Oliner and Daniel Sichel notes a fascinating discrepancy between price and performance data for microprocessors (see chart above). This is important because the rapid progress of processing power is what drives the technology revolution.
Moore’s Law – the trend identified by Intel co-founder Gordon Moore that computer power doubles every two years – has continued apace. But at the same time, whereas the measured price of computing power was falling at a rate of 70 per cent a year between 1998 and 2000, the pace of decline more recently has slowed to 3 or 4 per cent. That translates into a slower pace of measured productivity growth.
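A rough, assumption-laden check of that discrepancy: if chip performance doubles every two years at a broadly similar price, the quality-adjusted price of computing should fall by about 29 per cent a year, far faster than the 3-4 per cent decline now being measured (a hypothetical Python sketch, not taken from the Byrne-Oliner-Sichel study):

# Performance doubling every 2 years implies price-per-unit-of-performance
# halves every 2 years, i.e. an annual decline of 1 - 0.5**(1/2).
implied_annual_decline = 1 - 0.5 ** (1 / 2)
print(f"Implied quality-adjusted price decline: {implied_annual_decline:.0%} per year")  # ~29%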
Mr Oliner, currently at the American Enterprise Institute, a Washington think-tank, has a few ideas about what may be happening. One is an increase in Intel’s market power. “Starting in about 2006, which is when the break occurred, Intel really solidified its market position relative to AMD,” its main competitor, he says. Less competition may mean slower price declines for its older products.
In about 2006, Mr Oliner continues, “Intel itself had a major breakthrough and developed multi-core chips.” Instead of driving up “clock speed”, the most familiar way of measuring the processing speed of a chip using megahertz or gigahertz, it started including multiple copies of the basic processor within the same chip. If computing power were still measured using clock speed, however, the pace of improvement would appear to suddenly decline.
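A toy illustration, with hypothetical chip numbers, of why a clock-speed-only metric misses multi-core gains (it idealises away memory bottlenecks and imperfect parallelism):

# Compare a single-core chip with a later quad-core part at a similar clock.
old_chip = {"cores": 1, "clock_ghz": 3.0}
new_chip = {"cores": 4, "clock_ghz": 3.2}

clock_only_gain = new_chip["clock_ghz"] / old_chip["clock_ghz"]      # ~1.07x
ideal_throughput_gain = (new_chip["cores"] * new_chip["clock_ghz"]) / (
    old_chip["cores"] * old_chip["clock_ghz"]
)                                                                    # ~4.3x
print(clock_only_gain, ideal_throughput_gain)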
The US Bureau of Labor Statistics uses a range of tools to measure computing power. One argument in favour of its data – which suggests the pace of progress in computer chips has slowed massively – is that consumers seem to be replacing their desktops less frequently.


Monday, August 8, 2016

Delta Air Lines, the No. 2 U.S. carrier by traffic, said "our systems are down everywhere."

Planes have since started taking off again but not before wreaking havoc on travel plans around the globe


A power outage at Delta Air Lines' headquarters in Atlanta early this morning caused a global computer failure that halted all flights, stranding tens of thousands of Delta passengers around the world.
I keep being amazed and appalled at how centralized our computer systems have become -- and therefore how a single power failure or computer glitch can bring down an entire global system. Our economic system is simultaneously becoming more and more centralized. Fifteen years ago we had 12 major airlines; now we have just four.
This increasing centralization of computers and Internet communications, combined with increasing centralization of economic power, creates enormous fragility and vulnerability – not just to terrorism but to natural disasters and glitches. Digitization and the Internet were supposed to enable distributed systems – decentralized communications, small-scale sources of power, innovative small businesses, fail-safes of all kinds. We no longer depend on giant economies of scale. Yet we’re ending up with just the reverse: Massive centralization, which is outright dangerous.
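A toy probability sketch of that fragility argument (my numbers, and it assumes genuinely independent systems, which is exactly what centralization takes away):

# If one system is down at some point in a year with probability p, then N
# independently run systems are all down together with probability p**N.
p = 0.05                      # assumed annual outage probability for one system
for n in (1, 4, 12):          # echoing the shrinking count of major airlines
    print(f"{n} independent system(s): P(everything down) = {p**n:.2e}")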
What do you think?


Tens of thousands of Delta passengers around the world were grounded…
NBCNEWS.COM | By Alastair Jamieson




Delta Air Lines Says Some Flights Resume but Delays, Problems Remain
Wall Street Journal


The No. 2 U.S. carrier by traffic said “our systems are down everywhere.”


Airline advising travelers to check status of flights while issue is being addressed
WSJ.COM | By Robert Wall

Tuesday, August 2, 2016

Deming Lecturer Award...The 2009 ASA Deming Lecture by J. Stuart Hunter, Princeton University.

https://www.youtube.com/watch?v=ihMk4DorKKA

Published on Jul 26, 2016
The 2009 ASA Deming Lecture by J. Stuart Hunter, Princeton University.
This was the 14th Deming Lecture.

New quality challenges for statisticians

The lectureship was established in 1995; the first lecture, in 1996, was given by Brian Joiner, author of Fourth Generation Management (第四代管理).




View other talks from ASA's 2009 Joint Statistical Meetings conference





*****
http://www.amstat.org/awards/deminglectureraward.cfm

Deming Lecturer Award


Dr. W. Edwards Deming
The Deming Lecturer Award was established in 1995 to honor the accomplishments of W. Edwards Deming, recognize the accomplishments of the awardee, and enhance the awareness among the statistical community of the scope and importance of Deming's contributions. Notable among Deming's achievements are the following:
Contributions to Sampling
  • Wrote one of the first books on survey sampling in 1950, which is still in print
  • One of the earliest writers to consider multiple factors that might affect the quality of survey estimates
  • Quality in sample surveys draws on many ideas from Deming's work on quality improvement
Statistical Contributions to Business and Industry
  • Deming's skills as an extraordinary data analyst, methodologist, and thinker about the foundations of learning from data led to his being the leading analyst of data in the transportation industry.
  • More generally, Deming was a powerful and tireless advocate for the use of statistical methods and thinking for quality improvement in business and industry.
Contributions to Management
  • Developed and disseminated a theory of management (i.e., System of Profound Knowledge) that contributed materially to improved performance of enterprises in many countries, most notably in Japan and the USA.

Current Winner


2015: William Q. Meeker, Iowa State University, "Reliability: The Other Dimension of Quality"
For specific questions about the Deming Lecturer Award, contact the committee chair.

Past Winners


2014: Sharon Lohr, Westat
2013: Vijay Nair, University of Michigan, "Industrial Statistics: Research vs. Practice"
2012: C.F. Jeff Wu, Georgia Institute of Technology, "Quality Improvement: From Autos and Chips to Nano and Bio"
2011: Roger W. Hoerl, GE Global Research, "The World is Calling: Should We Answer?"
2010: Brent C. James, Institute for Health Care Delivery Research, "Better: Dr. Deming Consults on Quality for Sir William Osler"
2009: J. Stuart Hunter, Princeton University, "Deming Today"
2008: Donald M. Berwick, Institute for Healthcare Improvement, "Inference and Improvement in Health Care"
2007: Douglas C. Montgomery, Arizona State University, “A Modern Framework for Enterprise Excellence"
2006: Ronald D. Snee, Tunnell Consulting, "Making Another World: a Holistic Approach to Performance Improvement"
2005: A. Blanton Godfrey, North Carolina State University, "Statistics, Quality, and Organizational Excellence"
2004: Colin Mallows, Avaya Laboratories, "Deming and Bell Labs"
2003: Wayne Fuller, Iowa State University, "Deming and Survey Sampling"
2002: Sir David Cox, "Current Problems and Challenges Facing Statistics"
2001: Gerald J. Hahn, GE Retiree and Consultant, "The Proactive Statistician"
2000: George Box, Professor Emeritus, University of Wisconsin, "Statistics for Discovery"
1999: Kenneth Prewitt, Director of the U.S. Census Bureau, "Census 2000 - Political Questions, Scientific Answers"
1998: Myron Tribus, Exergy Inc., "The Contributions of W. Edwards Deming to the Improvement of Education"
1997: Noriaki Kano, Science University of Tokyo, "Business Strategies for the 21st Century and Attractive Quality Creation"
1996: Brian L. Joiner, Joiner Associates, Inc., "Dr. Deming was a Statistician and Gandhi was a Lawyer"


Blog Archive