Hussars

The Death of Expertise


Hi all,


 


At the beginning of the year, an article was published in which the writer argues that much of the change in how we hold discussions (especially on the internet) comes from a change in how we understand the value of expertise on any given subject.


 


Now, before anyone assumes otherwise from my posting the link: this is not intended as a slight towards anyone posting on any forum specifically. I'm just looking to get opinions on the point the writer makes.


 


In an age where a "sense of entitlement" is considered to be a growing "epidemic", might it instead be that the changing way people perceive each other and their experiences is actually the larger issue?


 


We have a level of access not only to information but to each other that, as recently as a decade ago, was unthought-of by a vast portion of the world.  The younger generations are growing up with that immediate access to information, while the "older" generations are adapting to these changes and many still hold "old world" views.  So it might not be a "sense of entitlement" so much as a change in the understanding of expertise's value on a given topic.


 


http://thefederalist.com/2014/01/17/the-death-of-expertise/


 


Thoughts?



We live in the information age. Many of us who, like me, grew up on the precipice of this new age are, as you say, very different people from those who have taken it for granted their entire lives that they only need to stretch out a hand and grasp information from the net.  Information is not knowledge.  Being able to learn things isn't the same as knowing how they work in practice, or being able to sort fact from fancy, or from deliberate lies.


 


That's not to say teachers aren't getting better at dealing with this kind of thing. When I first started using the internet, around 1997-98, there were few ways for teachers to tell if you were plagiarizing.  Back then we would just present news articles and summarize them.  These days there are tools that help identify plagiarism, and I often see teachers asking students to write comparative essays on a news topic, ones that take several articles and compare them rather than just parroting the information.
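For what it's worth, the core of those similarity checkers is usually just measuring how much text two documents share. Here is a toy sketch of that idea (my own illustration, not how any real tool actually works): it compares overlapping word trigrams with a Jaccard score.

```python
# Toy plagiarism check: Jaccard similarity over word trigrams.
# Purely illustrative -- real checkers are far more robust (stemming,
# fingerprinting, huge source databases, etc.).

def trigrams(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def overlap(a, b):
    ga, gb = trigrams(a), trigrams(b)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

essay  = "the printing press changed how quickly information could spread across europe"
source = "the printing press changed how quickly information spread across all of europe"
print(f"similarity: {overlap(essay, source):.2f}")  # closer to 1.0 suggests copying
```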


 


As we adapt to the tools we have been given, we get better at understanding how to use them.  You're right in a way: many kids I've talked to think the world is "entitled" to give up its information to them.  I grew up mostly researching things from books and wrangling what I could out of a library. The information available on the net was so vast that it changed the way I thought about the world when I first started using it.


 


Eventually I found I was becoming dependent on it.  When I realized just how much information we take at face value, as something we are "entitled" to know, I was genuinely shocked. It's not an idea that only the younger generation has; it's everyone, including ourselves.  People who did not grow up with that vast mountain of data at their fingertips just tend not to put it that way.  So what happens if the information is wrong?  Or if you found the wrong, or outdated, information and took it at face value?


 


What is the "value" of information?  It is always in how you can use it.  Teaching someone how to find something isn't the same as expecting them to understand how it works, or to be able to use it effectively.  You have to teach people how to use the internet properly, not just expect the information to be there: how to find current data, and how to use reasoning, logic and perspective to sort through things and gain not just information, but knowledge.


Edited by Battlepaw

Software Developers needed. Candidates applying for the position should be experts in:


Java, JavaScript, Assembly, Python, Ruby, .NET, HTML, HTML5, PHP, SQL, VBasic, Scala, Groovy, Rails, Dart, CSS, C, C++, CoffeeScript, C#, Clojure, ColdFusion and COBOL. Any extra expertise in programming languages starting with the capital letter C will be taken into consideration.


 


 


So what does it mean to be an expert in today's world?


 


I personally believe that experts are not dinosaurs, and that the main point of the article is both interesting and probably correct. What has really changed in today's world is that we, the people, can see each other. And because we can see each other, we can share with each other. Knowledge and experiences are now public, since we are all connected through modern telecommunications. We don't need anyone to bridge and proxy us to the rest of the world. We don't need the anchorman to tell us the news. We don't need the teacher and the professor to teach us science. We don't need the doctor's verdict, only his confirmation.


 


The average human today is much more educated than the average human twenty years ago. That doesn't mean, though, that we are experts in every field. We have the potential to be experts, but just because the knowledge is out there doesn't mean we can consume it as easily as we can access it. What I believe is changing is the gap between the average human and the expert. It's getting narrower, much narrower, and it's happening fast. Expertise is still needed, but not to the extent it was needed before.


Edited by Issle


 




[Quoting Battlepaw's post above]





I agree, and as someone who recently started going back to school to finish my post-grad studies, I have seen a notable change in the way classes/seminars are handled.  I'm not sure yet whether it's a good or bad change overall, but it is different.


 



 




[Quoting Issle's post above]





 


Expert vs. expertise: I'd offer an idea similar to the author's (likely considered antiquated by some), and since it is more of a tangent than an actual part of my response, it's behind the spoiler tag.



You can have expertise in a subject's practical aspects or its theory; an expert has both, plus repeated real-world application over an extended period of time (appropriate to the subject), producing results that others with the same subject expertise consider accurate, correctly executed, and so on.


 


In the case of programming, someone with expertise would be a person who has completed enough coursework to have in-depth knowledge of the "best practices" for a language, OR someone who has reached the same level of understanding through trial and error (effectively self-taught practical experience).


 


An expert would have both formal education in the theory/background of the language and long-term use of the language itself across multiple use cases (coding one or more large-scale applications over, say, 1 to 2 years, or X number of smaller applications/reference libraries over Y number of years).  However, outside recognition by others considered to be at the same level of expertise, or higher, would be needed to establish the person as an expert.


 


As a gross oversimplification: in any job posting there are "Tin Badges" used to establish an expected minimum level of knowledge (education and work experience being the most basic of them).  The interview is really about establishing the applicant's actual level of expertise.



 


For education, I'd argue that while there is greater access to information, actual levels of education have decreased.  Or rather, the areas of education traditionally measured have not adapted correctly to the new world view on information.


 


Using the U.S. as an example (since everyone loves to bash our education system anyway lol).


 


Math classes previously required long-form working of equations to validate a student's understanding of the concept.  Even if you arrived at the correct answer, if you couldn't show that "proof", you failed the question.  Now students are allowed to use calculators/computers in most areas of mathematical education, on the assumption that if you get the correct answer, you likely understand the underlying theory.


 


There used to be a requirement in primary schooling for formal script/cursive as well as plain handwriting; now schools are either phasing that requirement out or considering doing so, depending on which state the school is in.


 


The logic is that since the majority of communication now occurs "online" or in a digital medium, keyboard skills are more important.  The side effect is that students' ability to spell the words they are using accurately also appears to decrease, since spell checkers are built into most word processors and web browsers.  Just typing this response I've used the auto spell checker several times myself.  Even knowing how to spell a word correctly, I have started to rely on the system to catch typing errors more than on actually proofreading my post (which I still do on topics like this one, where the discussion actually matters).
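As an aside on those built-in checkers: most of them essentially compare a typed word against a dictionary and suggest the closest match by edit distance. A rough sketch of that idea (my own toy example, not how any actual word processor implements it):

```python
# Toy spell checker: suggest the dictionary word with the smallest edit distance.
# Illustrative only -- real checkers add word frequencies, context, keyboard
# layout, and much larger dictionaries.

def edit_distance(a, b):
    # Classic Levenshtein distance via dynamic programming.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

DICTIONARY = ["their", "there", "expertise", "entitlement", "education"]

def suggest(word):
    return min(DICTIONARY, key=lambda w: edit_distance(word.lower(), w))

print(suggest("expertese"))  # -> expertise
print(suggest("thier"))      # -> their (ties broken by dictionary order)
```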


 


So now we have Language Arts classes needing to re-establish where their major data points come from, since spelling used to be one of the three main pillars of "formal" education (vocabulary, spelling, and usage/grammar).


 


When was the last time you saw any of these matter in the average forum post or online discussion?


 


Ask the average American forum user under the age of 20 who the last 15 U.S. presidents were.  (To be fair, I'm well over 20 and honestly would likely need to look them up myself lol... everything before Carter just gives me nightmares...)


 


The point of this wall of text?


 


Valuing expertise is needed now more than ever, specifically because of the amount of readily available information.  Having a wide breadth of general knowledge is not a bad thing, but there will always be a need for those who focus on more in-depth knowledge as well.


 


By the simple limits of being human, it is difficult (if not impossible) for everyone to have both.  We all differ in our ability to store and process the information we have access to, we all learn differently, and we all have different areas of interest.


 


We may overlap in various aspects, but someone will always have more knowledge, a better ability to use that knowledge, or simply better access to information on a topic; and, being human, they don't always share unless there is a reason to.



The only science I believe in is the bible and everything you all speak of about this 'logic' is wrong.


 


MURICA!


Edited by Object

I saw that article. I've got mixed feelings about it, and I think the core of the problem is more that people don't realise how completely different knowledge and experience are. You can study a subject until you're blue in the face, but applying that knowledge in the real world leads to experience.


 


Experts don't just read up on their subject matter, they've been immersed in it. They've seen what works and what doesn't. They've got programmed instincts which will tell them right from wrong long before the textbook or Wikipedia article would. It's foolish to ignore such people.


 


On the other hand, the democratisation of knowledge via the internet is, in my opinion, the single greatest advance since writing itself. Nobody has a monopoly on knowledge anymore, and it's really difficult to squirrel it away like an old-school scholar to protect your power. And that's a good thing.



If you look at any recent MBA grad you can see the difference between expert and expertise. I personally have people with comparatively lower levels of education who handle business more effectively.


If you look at any recent MBA grad you can see the difference between expert and expertise. I personally have people with comparatively lower levels of education who handle business more effectively.

 

There was an interesting discussion on the effect/impact of MBA holders on current business practices (I think it was on LinkedIn; I'll see if I can find a link to it) that I believe came to the same conclusion.  Depending on the individual's ability to put theory into practice you get varying benefit (if any at all), while those holding the MBA generally felt they had the knowledge to run companies effectively (sometimes even when shown they were the cause of the problems).

 

At the same time, the larger issue with a good portion of MBAs is that they repeat the same practices as their predecessors (because that is what they learned to do), which the last few years have shown to be a longer-term problem in modern business.  Those able to adapt the theory into new practices are the folks who tend to meet and exceed their business goals.

 

 

[Quoting the earlier reply above about knowledge vs. experience]

 

Ultimately, in such a system, how do you educate yourself about any topic and how do you ensure there is any accuracy when nothing needs to be validated?

 

[Redacted due to this not being a poli-sci discussion, but here is a quote from C.S. Lewis's "Screwtape Proposes a Toast" behind the spoiler tag]

Democracy is the word with which you must lead them by the nose. The good work which our philological experts have already done in the corruption of human language makes it unnecessary to warn you that they should never be allowed to give this word a clear and definable meaning. They won’t. It will never occur to them that democracy is properly the name of a political system, even a system of voting, and that this has only the most remote and tenuous connection with what you are trying to sell them. Nor of course must they ever be allowed to raise Aristotle’s question: whether “democratic behavior” means the behavior that democracies like or the behavior that will preserve a democracy. For if they did, it could hardly fail to occur to them that these need not be the same.

 

You are to use the word purely as an incantation; if you like, purely for its selling power. It is a name they venerate. And of course it is connected with the political ideal that men should be equally treated. You then make a stealthy transition in their minds from this political ideal to a factual belief that all men are equal. Especially the man you are working on. As a result you can use the word democracy to sanction in his thought the most degrading (and also the least enjoyable) of human feelings. You can get him to practice, not only without shame but with a positive glow of self-approval, conduct which, if undefended by the magic word, would be universally derided.

 

The feeling I mean is of course that which prompts a man to say I’m as good as you.

 

The first and most obvious advantage is that you thus induce him to enthrone at the center of his life a good, solid, resounding lie. I don’t mean merely that his statement is false in fact, that he is no more equal to everyone he meets in kindness, honesty, and good sense than in height or waist measurement. I mean that he does not believe it himself. No man who says I’m as good as you believes it. He would not say it if he did. The St. Bernard never says it to the toy dog, nor the scholar to the dunce, nor the employable to the bum, nor the pretty woman to the plain. The claim to equality, outside the strictly political field, is made only by those who feel themselves to be in some way inferior. What it expresses is precisely the itching, smarting, writhing awareness of an inferiority which the patient refuses to accept.

 

And therefore resents. Yes, and therefore resents every kind of superiority in others; denigrates it; wishes its annihilation. Presently he suspects every mere difference of being a claim to superiority. No one must be different from himself in voice, clothes, manners, recreations, choice of food: “Here is someone who speaks English rather more clearly and euphoniously than I — it must be a vile, upstage, la-di-da affectation. Here’s a fellow who says he doesn’t like hot dogs — thinks himself too good for them, no doubt. Here’s a man who hasn’t turned on the jukebox — he’s one of those goddamn highbrows and is doing it to show off. If they were honest-to-God all-right Joes they’d be like me. They’ve no business to be different. It’s undemocratic.”

Edited by Hussars


[Quoting Hussars' reply above about MBA holders]

 

 

 

Ultimately, in such a system, how do you educate yourself about any topic and how do you ensure there is any accuracy when nothing needs to be validated?

 

Why does your question contain a false statement? On the internet, everything needs to be validated, and that's basically your answer. I think most people apply a scepticism to internet information that would be of great benefit if they also applied it to textbooks.

 

As for MBAs... Is the problem the MBA, or the fact that the majority of people who enter management as a career are manifestly unsuited to it? Is the problem the MBA, or that a LOT of people popping out of a university are under the misapprehension that their learning is over, rather than that it has just begun? Or that university courses (in a lot of fields) are years behind the latest research and thinking - business administration and IT usually being good examples?


[Quoting Issle's job-advert example above]

Meh, I see that kind of job offer all the time. What it proves is that the person writing the advert has no expertise in the field they are advertising for. I also see plenty of "need ten years' experience" for languages which have only existed for four. I know your example is made up, but that sort of thing really is common.

 

Expertise is earned understanding of a subject. It requires time and effort and generally deserves respect. Nothing has changed here.

 

An expert can have expertise, but unfortunately all they actually need is an opinion and a good line in male bovine excrement. Nothing has changed here either.

 

Expertise = Signal

Expert = Noise

 

Things are good if you have a good signal to noise ratio.

 

The whole sense-of-entitlement thing is just a red herring, though. Before the Interwebs the "experts" were found propping up the bar in the local pub; now they drink supermarket beer while valiantly battling the enemies of ignorance online.


When the article talks about Jenny McCarthy... I remember it with disgust.


I used to have a niece who was autistic, deaf, and had leaking lungs from birth. With these complications, she had to take many medicines and undergo blood-washing, which were only there to give her a bit more time. It was clear, though, that with the knowledge in our country and the money we had, we couldn't do a thing; we just had to give her the best life we could until she took her forever sleep. If I am not wrong, she passed away at age 17, with the mentality of a 6-year-old and the physical growth of a 10-year-old.


 


I visited her a few months before her death and felt I could try to use my expertise in googling to find alternatives. I came across one of those websites that claim it is possible to improve an autistic condition through a proper diet. I was excited, thinking that, as usual, the internet shared knowledge I might be able to use... after all, I learned to make games from the internet. Then, when I got to the page about the proper diet, I was so disgusted I wanted to slap the woman who owned the website. Her story of her autistic child and her wish to help other parents felt so noble, except the diet advice came with a price tag, including monthly-fee therapy advice and such. Enraged by that, I decided to research these claims further and found people living in a lie they wished to believe. Selling advice on the internet reminds me of those single-page websites that promote questionable books about making money on the internet (or in a game). But what I couldn't believe next was their claim that doctors and pharmacies are in a conspiracy with the government over vaccines: that the vaccines are there as population control and to make sure people keep spending money on medicines. That is very depressing... they dehumanize the doctors, forgetting that doctors, just like them, hope to be able to help.


 


Before that event, I already knew that the internet holds both false and true knowledge, and that for things which require expertise, the advice should be taken with a grain of salt. But finding something as absurd as what the anti-vaccine group does shows the danger of the internet to those who are desperate.



The beginning of wisdom is having a grasp on just how much one does not know.



[Quoting the reply above]

 

Response to "Validated"

Either I did not present that statement correctly, or it was misunderstood in context; either way, my apologies, as I should have framed it better.

 

My meaning was that for any media format, there is (or should be) a certain level of "healthy" skepticism.  With history texts, for example, it's the understanding that what is recorded has been filtered by those penning the contents, and is thus subject to coloring by the personal views of the author or editor.  If we're performing an in-depth study of that content, we attempt to validate it by cross-referencing sources as well as opposing views to arrive at our own interpretation.

 

In the context of general education (which usually only serves to establish a basic understanding of a topic and to open the way for future study), a history book is used on the assumption that its content has been vetted by those with the ability and knowledge to do so accurately, and so it stands as a "validated" general reference for the topic being taught.

 

In a solely self-taught version of that general-education example, two people in the same study group using only the internet can arrive at diametrically opposed views on a given topic, because their sources were never vetted and they were not working from a common reference; the source is not "validated" (even if its content might have been).

 

Take the "discovery" of America as an example: depending on the sources an individual uses, you can find it stated as one of several dates, and that's before you get into the "conspiracy theories".

 

There are a few other easily seen cases of this, usually associated with "major" human achievements (**cough** invention of the internet **cough**).

 

Because of the nature of the medium, there is little to no enforced oversight of the content published; thus, as a medium, it is not "validated", while (again) some of the content on it may be.

 

In terms of general education, how do you measure someone's "learning" in such a system?

 

MBAs

People can be "drawn" to management education for any of several reasons; each person has their own, and the comments usually get closer to a discussion of religion or politics the longer the thread goes on.

 

Hells, the belief in Myers-Briggs alone as a guiding force in career and/or hiring selection makes me shudder.

 

Honestly, my opinion is that it's a combination of all of those, and, again in my opinion, the largest issue continues to be that a very visible portion of those who follow that track also tend to be less flexible in their strategies (i.e. strategic planning and execution) at a corporate level.  I won't say all, or even more than half, but there have been a lot of visible failures in the last decade that followed "the MBA playbook" play by play.

 

Which brings me to discussions like this: I enjoy the discussion, and it helps me to research, collect the thoughts of others, and continue to educate myself in ways I might not on my own :)

 

@Gaeron

 

True, but at least in pubs/bars you get to see their faces, the owner can kick them to the curb, and you can only fit so many of them in a single place.  The internet allows them to gather to a critical mass (pun not intended), and to do so behind masks which are much harder for the "general populace" to penetrate, specifically because there is such a huge library of information at their fingertips.

Edited by Hussars


The more forums I read, the better I like this klaa guy lol


[Quoting Hussars' reply above]

 

I still don't think I understand what you're saying about learning. There are some things you can learn which are verifiable facts. For example, you can verify with some degree of certainty at what date Columbus 'discovered' the 'new world'. On the other hand, there are other things which are more a matter of opinion. For example, how to manage people - the vast majority of what is written on this subject is the personal opinion of one academic or another, or one manager or another. Very little on this subject is actually published scientific research.

 

If I may extend your "discovery of America" example - the fact is that there is plenty of evidence that Vikings reached North America centuries before Columbus, and before the Vikings there are hints of even earlier explorers who went west following the diminishing populations of walrus... To look at the same question another way, you and I both know that the earliest discoverers of America almost certainly arrived from Siberia about 8-10,000 years ago. So even in an area where we think we can "verify" the facts and teach "one truth", in reality it just isn't that simple. The most we can say is that we're pretty sure Columbus landed in 1492 AD.

 

As to how I can measure someone's learning from an open system - well, I can't speak for academia, but when hiring technical staff, I quite literally give them a written exam (useful for gauging their knowledge of a topic), and then I interview them with an expert in their field to validate their experience and get a feel for how they apply their knowledge. There is another interview after that, which is outside the scope of this discussion.

 

One problem, I think, with your average MBA is an apparent lack of distinction between fact and opinion. Structured learning all too often leads to dogmas founded on personal opinions, and that's how you get people who have had four or more years of training on a given subject making counter-productive decisions.

 

You mention Myers-Briggs. Good example. A highly interesting framework and an insightful way of categorising people's brains - but there is little to no scientific evidence for its accuracy in predicting overall behaviour, let alone competencies. In my own practical experience of people management, Myers-Briggs was useful more for the individuals to think about themselves than for me to make use of their abilities. A far more useful framework (which I am reasonably certain no MBA speaks of) is the 'Strengths Finder' system developed by Clifton and published by the Gallup press. I found that system not only useful for individuals to gain insight into their abilities, but also for teams to better understand each other, and for myself to better understand (relatively quickly) what kind of responsibilities an individual will excel at - and what approach they may need. That said, I would regard the entire field as experimental, primitive and new. In 20 years' time we'll all probably look back at SF 2.0 and laugh.

 

I would hold up Gallup in general, actually, as being one of the very few organisations who publish actual RESEARCH into management.

 

I think I agree about corporate strategy... not much to say there; I've seen very little actual strategy out there. Most corporations behave in a manner more comparable to a simple organism like a slug than to a primate. Reaction, not planned action, and certainly not long-term planned action.

