Years ago I worked on the shop floor of a manufacturing plant. I had worked my way through college at another plant, so I definitely identified more with the hourly workers than the “suits.” (Even though most of the guys referred to me as “college boy.”)
One day the department manager stopped by. He asked about my background. He asked about my education. He asked about my career aspirations.
“I’d like to be a supervisor,” I answered, “and then someday I’d like your job.”
He smiled and said, “Good for you. I like a guy with dreams.” Then he paused.
“But if that’s what you really want,” he said, looking me in the eyes, “first you need to start looking the part.”
I knew what he was saying but decided to play dumb. “What do you mean?” I asked.
“Look around,” he said. “How do supervisors dress? How does their hair look? How do they act? No one will think of you as supervisor material until they can actually see you as a supervisor — and right now you look nothing like a supervisor.”
He was right. I was wearing ratty jeans with a couple of holes. (Why wouldn’t I? I worked around oil and grease all day.) I was wearing a cut-off t-shirt. (Why wouldn’t I? It was the middle of the summer and the air wheezing through the overhead vents was far from conditioned.) And my hair was pretty long, even for the day.
“But shouldn’t how well I do my job matter more than how I look?” I asked.
“In a perfect world your performance is all that would matter,” he said. “But we don’t live in a perfect world. Take my advice: if you want to be promoted into a certain position… make sure you look like the people in that position.”
I’ve thought about that conversation a lot over the years.
I’ve hired and promoted people who looked the part… and they turned out to be all show and no go. I’ve hired and promoted people who didn’t look the part at all… and they turned out to be superstars. I’m convinced that how you look and, to a large degree, how you act have nothing to do with your skill, talent, and fit for a job.
Still, he’s right: the world isn’t perfect. People still make assumptions about us based on irrelevant things like clothing and mannerisms… and height and weight and age and gender and ethnicity and tons of other qualities and attributes that have absolutely no bearing on a person’s performance.
So are you better off trying to conform?
Unfortunately, probably so. The people doing the hiring and promoting are people — and people tend to be biased towards the comfortable and the familiar. People tend to hire and promote people who are much like themselves. (If you remind me of me… then you must be awesome, right?)
Besides, highly diverse teams are like unicorns — we all know what one should look like, but unless you’re NPH you rarely encounter one in the wild.
And don’t forget that hiring or promoting someone who conforms, even if only in dress and deportment, makes many of the people making those decisions feel like they’re taking a little less of a risk. I know I was viewed — admittedly with good reason — as a wild card, and I’m sure that affected my promotability.
But still: are you better off being yourself and trusting that people will value your skills, experience, talent… and uniqueness?
Sadly I think that’s a move fraught with professional peril. If your goal is to get hired or promoted then expressing your individuality could make that goal much harder to accomplish. (Of course if being yourself in all ways is what is most important to you, by all means let your freak flag fly. Seriously.)
I have no way of knowing for sure, but changing how I dressed — and in a larger sense, tempering some of the attitude I displayed — would likely have helped me get promoted sooner. For a long time I didn’t look the part, didn’t act the part… and I’m sure that made me a less attractive candidate.
But that’s just what I think; what’s more interesting is what you think about fitting in and conforming. How has the way you look or dress affected your career?
THE 7 MOST IN-DEMAND TECH JOBS FOR 2018
CIO | Jun 6, 2018
From data scientists to data security pros, the battle for the best in IT talent will rage on next year. Here’s what to look for when you’re hiring for the 7 most in-demand jobs for 2018 — and how much you should offer based on experience.
Source: Computer World
AUTOMATION WILL MAKE LIFELONG LEARNING A NECESSARY PART OF WORK
As more companies adopt and learn through digital solutions, and as new forms of employment and investment opportunities strengthen the demand recovery, we expect productivity growth to recover, write James Manyika and Myron Scholes in Project Syndicate.
For years, one of the big puzzles in economics has been accounting for declining productivity growth in the United States and other advanced economies. Economists have proposed a wide variety of explanations, ranging from inaccurate measurement to “secular stagnation” to questioning whether recent technological innovations are productive.
But the solution to the puzzle seems to lie in understanding economic interactions, rather than identifying a single culprit. And on that score, we may be getting to the bottom of why productivity growth has slowed.
Examining the decade since the 2008 financial crisis – a period remarkable for the sharp deterioration in productivity growth across many advanced economies – we identify three outstanding features: historically low growth in capital intensity, digitization, and a weak demand recovery. Together, these features help explain why annual productivity growth dropped roughly 80% on average in the 2010-2014 period, to 0.5%, from 2.4% a decade earlier.
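As a quick sanity check on those figures (a minimal sketch using only the two rates quoted above, nothing from the underlying data):

```python
# A fall from 2.4% to 0.5% average annual productivity growth is a
# relative decline of just under 80%.
before, after = 2.4, 0.5  # average annual productivity growth, in percent
relative_drop = (before - after) / before
print(f"{relative_drop:.0%}")  # prints "79%", i.e. roughly 80%
```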
Start with historically weak capital-intensity growth, an indication of the access labor has to machinery, tools, and equipment. Growth in this average toolkit for workers has slowed – and has even turned negative in the US.
In the 2000-2004 period, capital intensity in the US grew at a compound annual rate of 3.6%. In the 2010-2014 period, it declined at a compound annual rate of 0.4%, the weakest performance in the postwar period. A breakdown of the components of labor productivity shows that slowing capital-intensity growth contributed about half or more of the decline in productivity growth in many countries, including the US.
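The compound-annual-rate comparison above can be made concrete with a short sketch. This is illustrative only; the five-year compounding window is an assumption matching the 2000-2004 and 2010-2014 periods named in the text:

```python
# Illustrative: what a compound annual growth rate (CAGR) implies cumulatively.
def cumulative_change(cagr: float, years: int) -> float:
    """Total fractional change from compounding `cagr` for `years` years."""
    return (1 + cagr) ** years - 1

growth_2000s = cumulative_change(0.036, 5)    # 3.6%/yr compounds to about +19%
decline_2010s = cumulative_change(-0.004, 5)  # -0.4%/yr compounds to about -2%
```

Because growth compounds, the gap between the two periods is larger than the annual rates alone suggest: 3.6% a year adds nearly a fifth to the average worker’s toolkit over five years, while -0.4% a year erodes it by about 2%.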
Growth in capital intensity has been weakened by a substantial slowdown in investment in equipment and structures. Making matters worse, public investment has also been in decline. For example, the US, Germany, France, and the United Kingdom experienced a long-term decline of 0.5-1 percentage point in public investment between the 1980s and early 2000s, and the figure has been roughly flat or decreasing since then, creating significant infrastructure gaps.
Intangible investment, in areas such as software and research and development, recovered far more quickly from a brief and smaller post-crisis dip in 2009. Continued growth in such investment reflects the wave of digitization – the second outstanding feature of this period of anemic productivity growth – that is now sweeping across industries.
By digitization, we mean digital technology – such as cloud computing, e-commerce, mobile Internet, artificial intelligence, machine learning, and the Internet of Things (IoT) – that is moving beyond process optimization and transforming business models, altering value chains, and blurring lines across industries. What differentiates this latest wave from the 1990s boom in information and communications technology (ICT) is the breadth and diversity of innovations: new products and features (for example, digital books and live location tracking), new ways to deliver them (for example, streaming video), and new business models (for example, Uber and TaskRabbit).
However, there are also similarities, particularly regarding the effect on productivity growth. The ICT revolution was visible everywhere, the economist Robert Solow famously noted, except in the productivity statistics. The Solow Paradox, as it came to be known, was eventually resolved when a few sectors – technology, retail, and wholesale – ignited a productivity boom in the US. Today, we may be in round two of the Solow Paradox: while digital technologies can be seen everywhere, they have yet to fuel productivity growth.
MGI research has shown that sectors that are highly digitized in terms of assets, usage, and worker enablement – such as the tech sector, media, and financial services – have high productivity. But these sectors are relatively small in terms of share of GDP and employment, whereas large sectors such as health care and retail are much less digitized and also tend to have low productivity.
MGI research also suggests that while digitization promises significant productivity-boosting opportunities, the benefits have not yet materialized at scale. In a recent McKinsey survey, global firms reported that less than a third of their core operations, products, and services were automated or digitized.
This may reflect adoption barriers and lag effects, as well as transition costs. For example, in the same survey, companies with digital transformations under way said that 17% of their market share in core products or services was cannibalized by their own digital products or services. Moreover, less than 10% of the information generated by and flowing through corporations is digitized and available for analysis. As these data become more readily available through blockchains, cloud computing, or IoT connections, new models and artificial intelligence will enable corporations to innovate and add value through previously unseen investment opportunities.
The last feature that stands out in this period of historically slow productivity growth is weak demand. We know from corporate decision-makers that demand is crucial for investment. For example, an MGI survey conducted last year found that 47% of companies increasing their investment budgets were doing so because of an increase in demand or demand expectations.
Across industries, the slow recovery in demand following the financial crisis was a key factor holding back investment. The crisis increased uncertainty about the future direction of consumer and investment demand, so decisions to invest in boosting productivity were, rationally, deferred. When demand started to recover, many industries had excess capacity and room to expand and hire without needing to invest in new equipment or structures. That led to historically low capital-intensity growth – the single biggest factor behind anemic productivity growth – in the 2010-2014 period.
But, as more companies adopt and learn through digital solutions, and as new forms of employment and investment opportunities strengthen the demand recovery, we expect productivity growth to recover. Myriad factors contribute to productivity gains, but it is the twenty-first century’s steam engine – digitization, data, and its analysis – that will power and transform economic activity, add value, and enable income-boosting and welfare-enhancing productivity gains.
Source: Project Syndicate
WHY AI ISN’T THE DEATH OF JOBS
Companies using AI to innovate are more likely to increase employment, writes Jacques Bughin in MIT Sloan Management Review.
When pundits talk about the impact that artificial intelligence (AI) will have on the labor market, the outlook is usually bleak, with the loss of many jobs to machines as the dominant theme. But that’s just part of the story — a probable outcome for companies that use AI only to increase efficiency. As it turns out, companies using AI to also drive innovation are more likely to increase head count than reduce it.
That’s what my colleagues and I recently learned through the McKinsey Global Institute’s broad-based research initiative aimed at understanding the spread of AI in economies, sectors, and companies. We polled 20,000 AI-aware C-level executives in 10 countries to compile a sample of more than 3,000 companies (mostly large), identified distinct clusters within that pool, and ran a variety of scenarios on those clusters to project the effects of AI on employment, revenue, and profitability.
This research and analysis suggest that although AI will probably lead to less overall full-time-equivalent employment by 2030, it won’t inevitably lead to massive unemployment. One major reason is that early, innovation-focused adopters are positioning themselves for growth, which tends to stimulate employment. (See “How AI-Based Innovations Drive Employment.”)
Here’s how we expect things to play out in the five clusters of companies we examined.
Enthusiastic innovators, or pioneering companies that make early investments in AI and embrace the disruption it can create in the quest for advantage, adopt a full range of AI technologies and use them to bolster innovation and efficiency. These companies are analogous to what sociologist and communication theorist Everett Rogers called “early adopters” back when he coined the term — they’re intrinsically motivated to use new technology to shape and open markets. While this approach is potentially complex in the short term, our analysis shows that by 2030, the profitability of enthusiastic innovators will grow 8% faster than that of the average company on an annual basis, their revenue will grow 4% faster, and their head count will rise 2.2% faster.
Source: MIT Sloan Management Review