As part of my series on data interpretation in L&D (see Part I, Part II and Part III to catch up) I thought it would be a nice idea to take a closer look at the 2024 LinkedIn Workplace Learning Report. I did the same in 2022 (not sure why I skipped 2023…), but this time I wanted to apply more of a data and analytics lens.
Before we start, I want to share that I really like what LinkedIn Learning has done for corporate L&D over the past years. I've worked with them on several occasions, and I know really good and knowledgeable people who run LinkedIn courses that are worth spending time on. I've also seen how employees from companies that offer LinkedIn Learning are pleased with the opportunity it gives them to learn.
But I simply could not pass up this opportunity to close-read the report from a data & analytics perspective, for three reasons.
First, this report is possibly one of the most recognized 'state of the industry' reports from an L&D vendor (so excluding organizations like ATD, CIPD, LPI and the Learning Guild). Secondly, it provides a great learning opportunity for L&D professionals on how to read a report in terms of data and analytics: on what is actually being said, and on what is not being said. And thirdly, given the people I know who work at LinkedIn Learning, they are open to constructive feedback and continuous improvement, and I expect they would welcome mine.
So please regard this post as an illustration and opportunity to learn how to interpret data in L&D. Let’s go!
Data Sources
The first thing to always consider when reading such a report is: "what are the data sources?" Most reports wait until the very last section, or even the appendix, to explain this to you, in the hope that you would not really read it. So I was very pleased to see that this report mentions its sources in the introduction! Well done!
What this information is telling us is the following:
Survey Results: Page 36 (at the end indeed ;-) tells us that LinkedIn got 1,636 survey responses from L&D and HR professionals, as well as 1,063 responses from 'learners'. I assume these learners are members of LinkedIn Learning. A couple of initial thoughts. First, I think the number of responses is fairly low considering that there might be 2-3 million L&D professionals in the world, and especially since only a small percentage of LinkedIn Learning members have taken the effort to complete the survey. This could lead to selection bias, a main root cause of misinterpreting data. Secondly, you need to keep in mind that a survey is typically considered (at least in data science) a data source of questionable reliability. I explain this more in my post on skill assessments. The example there is that asking people whether they have a certain skill does not prove that they have that skill! Any data point that comes out of a survey (or a rating, for that matter) always represents a person's opinion. It never represents any fact, other than that it is that person's response to a question. So any conclusions from survey data should ideally always be stated in terms of 'people think' or 'survey results indicate'. Any claims stronger than that, and especially conclusions that are presented as fact or evidence, should always be taken with a grain of salt…
LinkedIn Behavioral Data: No doubt this is what makes the report worth reading (I hope), because LinkedIn has tons of data that holds unbelievable potential value. So much value, in fact, that I expect they only share a fraction of what that data represents. But that is fully understandable given they are a commercial company! One thing to keep in mind, though, is that irrespective of how much data LinkedIn collects… it has no access to data from outside LinkedIn. Now, given that many professionals have a LinkedIn account, and that according to LinkedIn itself (!) 77% of recruitment is done through LinkedIn, this still amounts to a lot of useful data. However, when it comes to L&D we need to ask the question: "How much learning is done through LinkedIn Learning?" And I honestly have no idea where or how to find the answer to that question. But I do know it is way, way lower than 77%. So all observations derived from LinkedIn Learning data must be interpreted with the notion that they might only reflect a few percent of what and how people learn!
L&D Pros: I was happy to see some great names on the list! And I will be curious to see what they have to say while I am going through the report.
Chapter 1: The state of L&D
The first chapter starts with the top 5 focus areas. Presenting a top 5 (or top 3, or top 10) is arguably one of the most popular ways of presenting data. And I have to confess, it's a great way to focus attention on what really matters. I also love the fact that 'Aligning learning programs to business goals' is number 1: first, because I feel every learning program should contribute (directly or indirectly) to business goals, and secondly, because this is a key value-add of learning analytics. Learning analytics is exactly what you need to connect learning to business goals.
There are, however, 2 questions I always ask myself when seeing top 5 lists. First, I would like to know the relative difference between these top 5 items. Is number 1, for example, far ahead of the rest and a clear 'winner'? Or is number 1 only marginally different from number 2? This makes a huge difference when it comes to deciding whether to, for example, focus on all 5, or just on number 1.
The second question I would always ask is what percentage the top 5 represents of all the focus areas listed and voted for. Does the top 5 represent 90% of the votes? Or maybe only 10%? Again, the answer to this question makes a huge difference if you want to decide where to focus. Because with a top 5 covering 90% of the votes, you can be sure that the top 5 should have your focus. But with a top 5 that represents only a small portion of the votes, that decision becomes more complex.
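To make these two questions concrete, here is a small Python sketch. All vote counts below are invented for illustration; none of them come from the LinkedIn report.

```python
# Hypothetical survey results: invented focus areas and vote counts,
# NOT taken from the LinkedIn report.
votes = {
    "Aligning learning programs to business goals": 310,
    "Upskilling and reskilling": 295,
    "Creating a culture of learning": 280,
    "Improving employee retention": 260,
    "Supporting career development": 250,
    # ...plus the long tail of all other focus areas, combined:
    "Other focus areas (combined)": 1200,
}

total = sum(votes.values())
top5 = list(votes.items())[:5]

# Question 1: how far is number 1 ahead of number 2?
lead = top5[0][1] / top5[1][1] - 1
print(f"No. 1 leads No. 2 by {lead:.0%}")

# Question 2: what share of all votes does the top 5 represent?
top5_share = sum(count for _, count in top5) / total
print(f"Top 5 covers {top5_share:.0%} of all votes")
```

In this made-up example, number 1 leads number 2 by only about 5%, and the top 5 covers just over half of all votes. Both numbers paint a very different picture than the bare ranking alone suggests.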
On the next page, we find an interesting data visual. It contains 3 statements: ‘4 in 5 people want to learn more on how to use AI in their profession‘, ‘learners who set career goals engage with learning 4x more than those who don’t set career goals‘, and ‘90% of organizations are concerned about employee retention and providing learning opportunities is the No. 1 retention strategy‘. So, let’s have a closer look at each of these statements.
The first statement ‘4 in 5 people want to learn more on how to use AI in their profession‘ is actually one that leaves very little room for discussion other than the scope of the data behind the statement. When reading such a statement, you always need to consider the question: "Which people?" Because the biggest mistake in interpreting this data is to think that 4 in 5 of all people in the world, or 4 in 5 of all white-collar workers, or 4 in 5 of all employees in your organization think this way. That is NOT what this statement is telling you. A more accurate version would be ‘4 in 5 people who have a LinkedIn Learning membership and who responded to our survey want to learn more on how to use AI in their profession‘. I fully understand that LinkedIn does not write it down like this. That is exactly why it is so important that we do read it like this!
The second statement, ‘learners who set career goals engage with learning 4x more than those who don’t set career goals‘, is more interesting. Because I actually suspect that there is not enough evidence in the data to support this. What typically happens in this type of analysis is that the data on people with career goals is correlated with the time people spend on learning. And then the data shows that people who set career goals spend, on average, more time learning through LinkedIn Learning. Now, I am not questioning the correlation, as I expect it is a real correlation. However, I do question the level of certainty with which it is presented. Most importantly because the statement, written in this fashion, suggests a causation that is not really there. The statement suggests that as a result of setting career goals, learners will spend more time learning. It's very important to understand that I am not saying this is not the case. But I would question whether LinkedIn has sufficient data to prove this causality. There might well be other reasons why there is a correlation between setting career goals and learning. Maybe it's the other way around: people who are lifelong learners have more of a tendency to set career goals than employees who see learning as a distraction from work? Or maybe there's a third factor that drives both career goal setting and learning?
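This 'third factor' scenario is easy to demonstrate with a toy simulation. In the sketch below (all numbers invented), a hidden trait I've called 'ambition' drives both goal setting and learning hours. Setting a goal has no direct effect on learning at all, yet goal setters still end up with roughly double the learning time:

```python
import random

random.seed(42)

# Toy simulation, purely illustrative: a hidden trait ("ambition")
# drives BOTH goal setting and learning hours. Goal setting itself
# has zero direct effect on learning in this model.
n = 10_000
goal_set, hours = [], []
for _ in range(n):
    ambition = random.random()                      # hidden confounder
    sets_goal = random.random() < ambition          # ambitious people set goals
    learning = 10 * ambition + random.gauss(0, 1)   # ambition drives learning
    goal_set.append(sets_goal)
    hours.append(learning)

avg_with = sum(h for g, h in zip(goal_set, hours) if g) / sum(goal_set)
avg_without = sum(h for g, h in zip(goal_set, hours) if not g) / (n - sum(goal_set))
print(f"Goal setters learn {avg_with / avg_without:.1f}x more on average")
```

The correlation is real, but concluding that setting goals causes the extra learning would be wrong: in this model, removing the goal-setting step would change nothing about learning time.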
The final statement ‘90% of organizations are concerned about employee retention and providing learning opportunities is the No. 1 retention strategy‘ is in my opinion the trickiest one. There are actually 2 statements in one. The first one (‘90% of organizations are concerned about employee retention‘) is reasonably informative, with the exception that I feel that saying ‘90% of HR and L&D professionals are concerned about employee retention‘ is a much more accurate representation of the data, simply because these were the ones questioned on this (assuming this was a question in the survey). Also, this statement does not say anything about other concerns mentioned. And theoretically it could be the case that there are other concerns that score equal to or even higher than 90%! One of the most famous 'misleading data' examples ever was the claim Colgate made in the UK in 2007 that ‘More than 80% of dentists recommend Colgate’. While this was not untrue, they conveniently left out the fact that more brands were recommended by dentists, some even more than Colgate. The consequence: the ad was banned after being ruled misleading.
There is also a big assumption made here: that HR and L&D professionals accurately represent the whole organization. Especially when it comes to the second statement, that ‘providing learning opportunities is the No. 1 retention strategy‘. I think this statement can be read in different ways. Do they mean that providing learning opportunities is regarded (by HR and L&D) as the No. 1 retention strategy? Or is it the actual No. 1 retention strategy used in companies? Or is it the recommended No. 1 strategy for other companies?
In addition to the ambiguity of this statement, I would also question to what extent it is influenced by the people questioned. It's natural that HR and L&D professionals (and especially L&D professionals) would look to learning as the key to employee retention. But does that mean it really is? What would be the answer if you asked business leaders, middle management or finance professionals? Would that still give this result? And what if you asked employees? While I am not saying that this statement is untrue, I would question to what extent it is influenced by asking only HR and L&D professionals…
The business case for learning is clear.
The next interesting data insight in the report is the section on the business case for learning and LinkedIn's definition of a learning culture. This is especially interesting because LinkedIn claims that ‘When it’s time to meet with executives, L&D pros can cite new LinkedIn research that demonstrates how learning drives desirable business outcomes.’ I would translate this as a call to all L&D professionals to bring this page to their next meeting with executives. But in all honesty, I am not sure if I would do so.
I do like the idea behind it. And the chart is very well constructed. But if I were an executive looking at this data, I can't help but think I would have some questions. And as an L&D professional, you should be prepared to be asked these questions and, more importantly, be able to answer them.
My first set of questions as an executive would be around the learning culture itself. I would want to understand what a learning culture is, and why the size of the L&D team, the rate of employee skill development, and the volume of learning-related posts on LinkedIn are a solid measure of one. I would question the size of the L&D team as a metric, as it suggests a bigger L&D team means a better learning culture; as an executive, I could equally argue that a bigger L&D team indicates a large inefficiency in how learning is organized. I would ask for the meaning of 'the rate of employee skill development' (I am honestly not sure what that actually means!), and would certainly question the importance of posting about learning on LinkedIn. I would see that as a very weak, if not totally irrelevant, indication of a learning culture.
My second set of questions would be about the correlation between a learning culture and retention, internal mobility, and manager promotions. I would ask for evidence that a learning culture is indeed the main contributor to these measures. Or are there other causes, programs and/or investments that have led to these outcomes?
If you, as an L&D professional, plan to show this data to your executives, I would highly recommend you think carefully about these questions and make sure you have a solid answer to each of them!
Chapter 2: Skills Agility
I love the topic of skills. It is so complex, there's a huge element of analytics in there, and it has kept us occupied for years already (and possibly for many years to come). And with AI coming into play, building AI skills is going to be crucial for every company!
I talk about skills and skills measurement a lot. So I was really pleased to see the conclusion from the report that few companies advance to the measurement stage. I recognize this conclusion and see a huge potential for L&D to start embracing skills measurement. With this statement I have let go of my neutral status completely. And I have made a mistake that many people make: I'm less critical of data insights that fit my perspective, confirm my suspicions and support my business model! This is just to illustrate that I am only human ;-)
But all joking aside, the upskilling and reskilling programs described in the statement are limited to the programs represented by the companies and people who responded to the survey. So it might well be that overall, a much higher percentage of programs has progressed to the measurement stage.
This would, however, be a nice source for benchmarking the state of your own upskilling and reskilling measurement!
Finally, one thing they have done really well in the chart is the comment at the bottom saying that ‘in each year 4% to 5% had not started their projects‘. This piece of information enables the reader to add the '22 and '24 percentages up to a full 100%. As a general rule, when percentages displayed in a data visual do not add up to 100%, you should be suspicious, as it means that the designer either calculated wrong, or left data out on purpose…
Going back to career goals
The report then goes back to explaining how setting career goals is a driver for learning. This actually does support their claim made earlier. But as with the earlier top 5, I would like to see how this top 8 compares internally and to the rest of the list. Do career goals really stand out? Or are they just marginally higher rated?
And interestingly enough, the next visual does demonstrate that LinkedIn understands the importance of providing percentages in a top 5 ranking! The 5 key practices for career development provide insight into how each practice relates to the others in the top 5, by also presenting the percentage of companies who mentioned it and illustrating this with the length of the bar. You can see that this creates a much clearer insight, similar to my example above! So well done, LinkedIn!
Maybe one small improvement I would suggest: I would always prefer to present the percentage relative to all responses, rather than the percentage of companies who selected that specific option. In my opinion, this is more accurate. And it allows you to always add the percentages up to a full 100%.
Oh, and maybe a final comment. In this case the report mentions that the data they have used is restricted to companies who have ‘mature career development‘. So keep that in mind when using this data.
Mobility Matters
The chart that accompanies the mobility discussion is an interesting one for a couple of reasons.
First, it illustrates that you always need to be careful when using percentages. In this case, the percentages do not add up to 100. This suggests that more than one answer was possible, as confirmed by LinkedIn in their comment that in many cases ownership of internal mobility sits with multiple people. This additional context is vital here, as ownership is typically associated with a single person or department. And that is a risk: people might not read the text, or the visual might be copied without this context, and anyone who is skeptical of your story and can add up to 100 could mistrust the data and insight because the percentages do not add up. So be careful presenting data like this!
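A simple habit that catches this class of problems is to sum the percentages before trusting a chart. Here is a minimal sketch of what the sum can tell you; the numbers are invented, not taken from the report:

```python
# Quick sanity check for any chart: do the percentages add up to 100?
# Both lists below are invented for illustration.
single_choice = [45, 30, 15, 6, 4]   # one answer per respondent
multi_select = [62, 48, 35, 20]      # respondents could pick several

def diagnose(percentages):
    """Return a plain-language reading of what the percentage sum suggests."""
    total = sum(percentages)
    if total == 100:
        return "adds to 100%: likely single-choice, fully reported"
    if total > 100:
        return f"adds to {total}%: likely multiple answers allowed"
    return f"adds to {total}%: options missing or left out"

print(diagnose(single_choice))
print(diagnose(multi_select))
```

A total above 100% is not wrong per se, but, as with the mobility chart, it demands an explicit note that multiple answers were allowed.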
My second observation is that the options provided in the survey are potentially limited and selective. This is the danger of any survey that limits responses. In this case and context, I would expect options like 'the supervisor/manager', 'the employee', 'business leaders' or the like. Without being an expert on internal mobility, I would think that each of these has at least a big role to play. By presenting the options like this, you could either get approximate results (responses where people select the option that does not correspond exactly but is the closest alternative), or people simply not answering the question, because their situation is not represented. That is why I always prefer to include an 'other' option…
Chapter 3: How L&D Succeeds
Chapters 1 and 2 feel like an extensive introduction to what is the core of the report. Or at least to what L&D professionals want to get out of it: how to be successful.
And naturally I was very excited to see 'lean in to analytics' and 'build the right metrics' as the number 1 & 2 priorities for L&D! Exactly why SLT Consulting exists, and exactly why we are developing the awesome Learning Analytics Toolkit to help every L&D professional improve their data and analytics knowledge and skills. And yes, 'how to select the right metrics' is one of the key pillars in this toolkit!
I like the mention of ‘54% more L&D pros list analytical skills on their LinkedIn profiles compared to a year ago‘. It might suggest that 54% of L&D professionals have this on their LinkedIn profile, but that is actually not the case. I would be curious what percentage of learning professionals it represents. Because while 54% seems like a high percentage, it could actually represent a low absolute number! This is good to remember, as it is a widely used trick in data visualization and storytelling: when the number is low and people want to inflate its importance, they can present a percentage. And when the number is high and people want to deflate its importance, they might also use a percentage!
The same observation can be made about the year-over-year growth of skills that help L&D succeed. I'm glad that LinkedIn points out the required continuous learning of L&D professionals. But I am less impressed with data that shows year-over-year growth in percentages only, for the same reason as the '54% more' statistic above: you have no idea about the volumes. And look at it from this angle. If you are a small start-up, it's much easier (although not easy!) to grow with double digits, simply because the numbers are small. A growth from 50k turnover to 75k turnover is a massive 50% increase! But if you are a large global company, a similar growth percentage would mean growing from 5 billion turnover to 7.5 billion turnover, which represents a massive 2.5 billion increase and is much, much more challenging!
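The start-up versus global company comparison can be put in a few lines of Python. The figures are the illustrative ones from the paragraph above, not data from the report:

```python
# Same growth percentage, wildly different absolute amounts.
# Illustrative figures echoing the start-up vs. global-company example.
cases = {
    "start-up": (50_000, 75_000),
    "global company": (5_000_000_000, 7_500_000_000),
}

for name, (before, after) in cases.items():
    pct = (after - before) / before
    print(f"{name}: {pct:.0%} growth, absolute increase {after - before:,}")
```

Both rows print the same 50% growth, while the absolute increases differ by a factor of 100,000. Whenever a report shows growth percentages only, ask for the underlying volumes.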
It's also interesting to observe that analytics and metrics are considered the number 1 & 2 priorities for L&D, but that analytical skills are not mentioned in the list. Maybe it's number 5 with 56% and just missed the cut? Again, I refer to my example and comments above on how to interpret top 3, 4 or 5 lists.
Closing Remarks
The rest of the report (and some pages I skipped) contains either few data-driven insights, or different expressions of observations and challenges already shared in this post.
Overall, I like the frequent use of data in this report, and I'm excited by the calls to action, including the one on analytics. I like that this report has a fairly high degree of transparency, which is always needed to gain trust.
However, I do see some challenges in correctly interpreting the data, and room for improvement in how the data is presented. The nice thing (and the reason I am writing this post) is that we can learn from this. The observations I have are not unique to this LinkedIn report. They are very common and all around us. In that sense I should both thank LinkedIn for providing these examples and apologize for using them as examples, where I could easily have used another report.
But if you want to take data seriously, and you want to improve your data interpretation skills, then here's my list of recommendations that you can apply to any market, business or research report:
Know the source data: Always check the source data. What data was used, but sometimes more importantly, what data was NOT used
Survey data represents opinions, not facts: Be careful when a data source is based on surveys. Surveys provide data that represents opinions. And no matter what advanced analytics you deploy, it can never turn an opinion into a fact!
Beware of top 5 lists: Do not rush to conclusions when seeing top 5 (or 3, or 4, or 10) lists. Always consider two things: how the top 5 items relate to each other, and how the top 5 relates to the rest.
Definitions and assumptions: Make sure you check what definitions are used and if you agree with them. Also check for (sometimes hidden) assumptions made.
Percentages can be tricky: Be careful with percentages! Especially if they do not add up to 100%.
Big % and small numbers: Be aware that a big % could represent only a small absolute number. This is one of the most common misleading data presentation tricks!