Robin Sargent 0:24
I have here with me today Smruti Sudarshan, and I met Smruti on LinkedIn. When she started talking to me about educational data mining, I was completely interested, because I had not encountered that topic, or what it is, or how it's useful, or what those use cases are. And so I am just so pleased to have Smruti on today to share with us about data mining in education, what that means, and to learn more from her. So, Smruti, will you please do a better job of introducing yourself and sharing some of your background?

Smruti Sudarshan
Yeah, sure. By the way, thanks for getting me online and giving me this platform to share what exactly educational data mining is. So, a brief introduction about myself: I work for LinkedIn as an e-learning training specialist, and I specialize only on the e-learning aspect of their learning and development side. So that's a quick introduction. But I am a data enthusiast. I love data, and I like the stories data tells. I'm really excited to uncover the insights data holds, especially when it comes to education and learners. To give a quick introduction to educational data mining, let me break this down into a more simplistic approach, wherein I'll first talk about what learner data is, then bring in the concept of mining, and then how exactly L&D and educational data mining, also called EDM, are going to marry each other in the end. So one thing about learner data is this: if you're in a classroom setting, face to face, what would you do as an instructor? You would basically see or read your learners' expressions, right? And then you would notice if someone's yawning, or if someone's disinterested, and you know,
Smruti Sudarshan 5:00
...we call it educational data mining, or EDM for short.

Robin Sargent
Oh my gosh, this is so interesting. I've already got questions going around. I love all the analogies that you're giving, Smruti. The first question I had was: okay, so we are reading their digital body language, which I just love as an analogy, because you're so right. My first place as a trainer was as a face-to-face trainer, and as soon as somebody yawns or shows that body language, you start doing silly things: your voice may get louder, it might get quieter, to see if they're going to pay attention, something surprising.
Smruti Sudarshan 44:59
...the right way to do it, and then, you know, go about it. So I think it's best not to listen to yourself, and go about listening to the data.

Robin Sargent
I love that advice. Get rid of your assumptions; look at the data. That is wonderful advice. Thank you so much, Smruti, for coming and sharing with us about educational data mining. Where can we find you and more resources, if we want to go dig further into EDM?

Smruti Sudarshan
Yeah. So I think LinkedIn is the best way to get in touch with me, and most of my blogs on EDM are there on LinkedIn. If not, there's a blog on my website; most of the data part of it is there, written as blogs on my website, so you could go ahead and
Robin Sargent 0:00
Welcome to Become an IDOL. I'm Dr. Robin Sargent, owner of IDOL Courses. This is the place where newbies come to learn and veterans share their knowledge.
When you build a program, are you thinking that you've got to put something extra, like some kind of xAPI or some other code, on that SCORM package, or whatever it is, in order to read the body language? Because it seems like most SCORM packages, as records go, are just about quizzes. And so I imagine you'd have to... are you looking for certain body language data before you build your course, so you can implement that into the kind of data that you collect in the course, before it ever gets loaded and ever gets tracked? Oh, yes,
Smruti Sudarshan 6:25
xAPI is one of them. That's the most important thing in order to track the digital body language. But we also use another program and algorithm which tends towards the learning analytics part, and it's basically a combination of both learning analytics and AI. So it's AI-driven as well. If I have to put this in a better format: imagine you have learning analytics as one circle, and in that you have your learner data. But learning analytics also overlaps with machine learning, the AI part of it, and where both of them overlap, there's that one small bit in the overlap, which is EDM. So EDM not only gives you the statistical data, it also shows you graphs and charts and things like that. So if you are an L&D consultant, and you're building a course, and you're giving consultation, performance consulting, what happens is that when you see those graphs, and when you analyze that data, you will be in a better position to give your stakeholders what they want, rather than, you know, designing something more intuitively. So there was one use case, not at LinkedIn but at a previous company where I was working, wherein we had a set of field engineers, and their main day-to-day job was troubleshooting. For example, if you're a troubleshooting engineer, the customer would give you a call and say, suppose, "My laptop is not working. How do you fix my laptop? I need to fix my laptop." So that was the kind of troubleshooting they would do, and it would come in the form of tickets, basically.
And what we did was, again intuitively, we learning consultants and instructional designers and content developers got together and said, "Great, this is a great opportunity, let's make this content gamified." And we gave them gamified content. And the feedback that we got from the stakeholders and the manager was: "See, this is not the correct format we were expecting, because as of now, we are not trying to solve the problem of getting a game on; we are trying to solve something else, wherein we want the learners to understand what their day-to-day job is, and also clear off the tickets on a day-to-day basis." So how we figured out this problem was that we had something known as the Net Promoter Score, and that score was linked onto an EDM site, the EDM platform or algorithm that was there. And we saw that whenever the field engineer would take a call, he or she would go and start playing the quest of that game, and they would not even complete the first level or first round of it, because they were so busy troubleshooting on the call. So we gathered that data, we gathered that insight, and we went back and told our stakeholders: "Okay, see, this is totally not working out. All they need is just a job aid, and maybe you can give them a series of microlearnings, three minutes each, on what exactly they are doing, what jobs they are doing, and how they can go about it. The gamified part of it, you can keep for the new hires." So when they come out of college, freshly graduated, and they're getting into the process training and the job training part of it, that is where you can give this course, rather than giving them the job aid.
And the result of it was that the number of tickets that got solved for that quarter increased, so it even tended to increase the revenue aspect of it, because the number of tickets getting solved was much higher compared to what they were doing on a day-to-day basis.
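The troubleshooting use case above hinges on events like "engineer started a quest level but never finished it" being recorded somewhere a platform can mine. Here is a minimal sketch of what such an event could look like as an xAPI-style statement, in Python; the verb and activity URLs are illustrative placeholders, not the actual course's identifiers:

```python
import json

def make_statement(learner_email, verb, activity_id, duration_seconds):
    """Build a minimal xAPI-style statement: actor / verb / object / result."""
    return {
        "actor": {"mbox": f"mailto:{learner_email}"},
        "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}"},
        "object": {"id": activity_id},
        # ISO 8601 duration; completion=False records the abandoned attempt
        "result": {"duration": f"PT{duration_seconds}S", "completion": False},
    }

stmt = make_statement("engineer@example.com", "attempted",
                      "https://example.com/course/quest-level-1", 45)
print(json.dumps(stmt, indent=2))
```

An LRS would aggregate thousands of statements like this; the EDM layer then looks for patterns such as many short, incomplete attempts on the same activity.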
Robin Sargent 10:40
So even just capturing that Net Promoter Score was enough data to have you guys change course and realize what the body language of the learners was. And so, I mean, how did you capture a Net Promoter Score so early in the course? I mean, how many did you build in? I'm just curious about where, you know, you structure things like this: data that means something. But that's
Smruti Sudarshan 11:10
actually a very good question. It gets into the details of how exactly data behaves. The thing with EDM is that there is no structure for it, no structure as-is. So how we captured the data as soon as possible was through the kind of drop-off rates that we were getting. Say, for example, the learner takes up the first mission of the course, and then he or she drops off at the second or third question itself. And that was a repeating pattern, not just for one person; it was a repeating pattern for other people as well. For example, if this course was taken by 10 people, six out of 10 were dropping off at that question itself, because they were getting tickets in such a way that the kind of troubleshooting they were doing was very monotonous, or they had just one issue they had to deal with. So that is where we figured out: this is where they're dropping off. Another thing we figured out was the devices part of it. There were people who would take the course while traveling on the metro, so they would play the game over there, finish the entire game, and then come back and try to implement the same thing in their day-to-day job. Or they would play the game during the coffee break and come back. But the problem was that the retention rate was not as great as it should have been. So when we saw that they were going back twice, thrice, multiple times to the same course, we analyzed it, and we just went to them and asked: why are you actually going back so many times? Is there a problem with the course?
Or is it that you're not able to understand what is happening in the course? So that's when they got back to us saying, "You know, this is not okay with us, because we have to play the entire game while we are talking to a customer. So do we concentrate on the game-winning part of it, or do we concentrate on the customer's call?" So that is how we figured out that they would need a job aid, rather than having a game context in there.
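The drop-off pattern Smruti describes, six of ten learners stopping at the same question, can be surfaced with a few lines of analysis once you know the furthest point each learner reached. A sketch in Python; the data and function names are invented for illustration:

```python
from collections import Counter

def drop_off_counts(furthest_question):
    """Count how many learners stopped at each question number."""
    return Counter(furthest_question)

# Furthest question reached by each of 10 learners (fabricated numbers,
# echoing the six-of-ten pattern from the story above)
furthest = [2, 2, 2, 2, 2, 2, 5, 8, 8, 8]
counts = drop_off_counts(furthest)
worst_question, n = counts.most_common(1)[0]
print(f"{n} of {len(furthest)} learners dropped off at question {worst_question}")
```

The same tally works whether the events come from an LRS export or a spreadsheet.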
Robin Sargent 13:28
That's a great example. Okay, so now there are places where you are looking at the data. And I've done this before: when I did my dissertation, I had to capture data, and I didn't even have any xAPI, because you can capture things like drop-off rates, whether they complete things, time inside the course, and how many times they were in the course, without any extra things. And so for most of the courses that you're looking at for this data mining, do they all need to have some kind of other features on there to track data? Do you recommend anything else that you're tracking? Like, I don't know, whether people hover on an interaction or something? What are some of the things, just from your experience, you'd recommend? Yeah, I
Smruti Sudarshan 14:21
think the only thing required to capture that data, as of now, is Excel. Excel is a very good way of capturing data, and also Google Forms. We use Google Forms as well. You don't have to have xAPI and all of that; even with a simple Google Form, in the responses section you would get something related to the data there. Again, it depends on how you want to do it. If you have a class of, say, 30, or a manageable 15 class members, then I think Google Forms would be the best way to do it. But if you have a course online like the one we were doing, catering to 2,500 employees of that batch (and in the education sector we were dealing with 500 students), that is where xAPI comes into the picture, because xAPI gives you a heatmap of what exactly is going on. And even if you do not have xAPI, if you have a normal LRS, and the LRS can visualize the data for you, I think you're good to go with it. Apart from that, your course ID is also good enough if you're searching for anything in terms of the courses part of it. But I would definitely recommend having, you know, an Excel sheet or Google Form if you need to collect data.
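As Smruti notes, a plain spreadsheet export is often enough to start mining. Here is a small sketch of summarizing a Google Forms CSV export with Python's standard library; the column names are hypothetical, not what Forms actually emits for any particular survey:

```python
import csv
import io

# Stand-in for a downloaded Forms responses export (invented data)
CSV_EXPORT = """Timestamp,Name,Completed module?,Minutes spent
2024-01-05,Ana,Yes,12
2024-01-05,Ben,No,3
2024-01-06,Chen,Yes,15
"""

rows = list(csv.DictReader(io.StringIO(CSV_EXPORT)))
completed = sum(r["Completed module?"] == "Yes" for r in rows)
avg_minutes = sum(int(r["Minutes spent"]) for r in rows) / len(rows)
print(f"completion: {completed}/{len(rows)}, average minutes: {avg_minutes:.1f}")
```

For a class of 15 to 30 this is plenty; the xAPI/LRS route earns its keep at the 500-to-2,500-learner scale she describes.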
Robin Sargent 15:49
Okay, so, just so we're clear: an LRS is a learning record store, which is a little bit different from an LMS.
Smruti Sudarshan 15:57
Not a little. I mean, usually these days an LMS comes with an LRS; if you get into any basic LMS, both of them come hand in hand. But in case you do not have an LMS, then I think an LRS is the next best option to go for.
Robin Sargent 16:16
So you're saying that just gives a heatmap when you put that xAPI on it, and there are no other specific things that you put on there, unless you specifically design that: I want to capture how many points they earn in this game, in the polls, or whatever. Okay.
Smruti Sudarshan 16:33
Right. It also talks about capturing real-time data. For example, if there are, say, about 10,000 people who have logged into your LMS and are taking the same course, the real-time data over there would show a leaderboard as such, but we would be seeing it in terms of scatter plots. What happens in scatter plots is that each point on the graph represents a learner. So say I am student 22 and you are student 23; what difference would that make? The data you would capture is that student 22 is spending five minutes on question five, but student 23 is spending 15 minutes on question five. So there is that difference: why is it that student 23 is spending so much time, while you're spending less? And in terms of video, there's another use case. There was an AI course we were developing, and that course had six videos of ten minutes each. So basically, the learner would sit on the LMS for 60 minutes, that's one hour of videos at a stretch, each video of ten minutes. What happened was that after the 20th minute, the learner would drop off. And it would not just be a pause and then coming back into the same LMS; it would be ten minutes, then drop off, coming back after two to three days, and then again spending ten minutes, dropping off, and coming in after two to three days. So when we figured out this kind of pattern happening online, what we found was that the learner was not able to grasp the video after the 20th minute, because the video had something to do with coding, and for coding we would require a sandbox environment. We would require them to type and do a project sort of thing.
So what we did was shorten the video length to about seven minutes, which was just the introduction part, and then we gave them a sandbox environment and a DIY project that goes on for the next two weeks, so that at the end of the course they would be submitting a DIY project. So we took a blended approach, rather than just a self-directed approach. Again, this is all calculated on the basis of data. We didn't just go by our intuition, saying, "This is not the right thing to do; now I'm going to change it to interactive learning, and learners will be happy with it." We designed a learner-centric approach by using the data, and not by intuition or our experience.
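The "watch ten minutes, leave, come back days later" signal is essentially a query over session logs. Here is a sketch of how it might be flagged; the data, names, and thresholds are all invented for illustration:

```python
from collections import defaultdict

def repeat_short_sessions(sessions, max_minutes=10, min_days=3):
    """Return learners with short sessions spread over several distinct days,
    the pattern read above as 'not able to grasp the video'."""
    days_seen = defaultdict(set)
    for learner, day, minutes in sessions:
        if minutes <= max_minutes:
            days_seen[learner].add(day)
    return {l for l, days in days_seen.items() if len(days) >= min_days}

sessions = [
    ("student22", 1, 10), ("student22", 4, 10), ("student22", 7, 9),
    ("student23", 1, 55),  # watched nearly the whole hour in one sitting
]
flagged = repeat_short_sessions(sessions)
print(flagged)
```

Learners in the flagged set are candidates for an intervention like the shortened video plus sandbox project described above.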
Robin Sargent 19:27
Does thinking about learner data, and making changes on the fly if you're in the moment, make you think a little differently about when it is that you roll a course out to a learner? Because maybe you do a draft, if you know that you're probably going to make some changes; maybe you don't build a full game before you release it to them and get some data. Has that changed the way you approach it, just because of the data that you can get in real time?

Smruti Sudarshan 20:01
Yes, actually, yes. What we do is create a lot of pilots, rather than just creating one full-fledged course. We create a lot of pilots before we even get to creating the course. I mean, there was this structure, or a process, that we would follow: first would be an introduction, then there would be a learning objective, and then you would get a video, and then a question. There was this kind of structure in whatever L&D field I was in. And what happened was that when we put this forth, we said, "You don't have to have that structure, because most learners do not even spend one second on certain pages. So you can definitely eliminate those pages and concentrate more on what you have to give." That led us to having a lot of pilots before even getting into the actual course development. So yeah, in a way, it was time consuming in the beginning. But once the process got ingrained in everyone, just getting that data and having that data-feedback loop, once that was ingrained, I think they got a feel for what they want and how they have to do it.
Robin Sargent 21:16
Okay, so I'm ready to keep going, Smruti. So tell me the next part. I mean, you told me they were going to get married at the end.
Smruti Sudarshan 21:27
Oh, yeah. So the marrying part is where the use case of another thing comes in. The pilot part is definitely one, but how exactly are they going to get married? One way is definitely how you're going to align your learning objectives with your business outcomes. I think that has helped us a lot. It's not just me telling it, or someone else in L&D telling it; it is the data that's telling it. And what happens when data tells, and when you mine those insights? There was one case wherein a course was rolled out. We were a global team; at that time it was just the dawn of COVID, we had shifted out, and everyone was working from home. And we rolled out a course that was purely in English. There were countries there like Latin America, Africa, the Philippines, and China as well. We rolled it out saying, "You know, English is global," and went about rolling it out like that. Then what happened was that the heatmap I was talking about showed what we were getting from our localizations. There was a lot of drop-off, and in the end no one in those places actually took the course at all. And the investment was such that we had put thousands of dollars into building that course, and everyone had to take it in order to clear it and get into the process training; we had a certificate setup and things like that. But since it was in English, no one in those areas took those courses.
So in the first quarter, we observed that there was a drastic drop in those areas. Only parts of the US, parts of India, and I think parts of Russia took up those courses. But everywhere else, China, the Philippines, the eastern part, didn't take them, and even Latin America didn't take the courses, just because they were not comfortable with the kind of English we were speaking and the kind of voiceover that was provided in the course. This was something figured out from the heatmap, by mining those data points on that heatmap. Since it was a global course, we were supposed to see some sort of green spots in each of the areas, which we were not able to see. So that's where we invested mostly in the localization bit, and in customizing it to the local area, so that we communicate what we want to tell them. And it's not only that; it forms this knowledge society as such. Learning doesn't have a language, and learning happens almost continuously; that was also something we were able to prove through the EDM part of it. And what we analyzed in the end (people do contradict this, and we had a lot of contradictions come in when we said this is how it is happening) was that the higher the effort required, the less the learning. So the more effort learners had to put in, the less they learned. Because the course we had just sent out had, I think, eight quizzes in it, and on quizzes one to five everyone had an average score, it was okay, and then from quiz five to seven the scores dropped down.
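The "no green spots" reading of the heatmap amounts to completion rates by region falling below what a global rollout should show. A minimal sketch of that tally; regions, numbers, and the 50% threshold are all illustrative:

```python
def completion_by_region(records):
    """records: (region, completed) pairs -> {region: completion rate}."""
    totals, done = {}, {}
    for region, completed in records:
        totals[region] = totals.get(region, 0) + 1
        done[region] = done.get(region, 0) + completed
    return {r: done[r] / totals[r] for r in totals}

# 1 = completed, 0 = never finished (fabricated sample)
records = [("US", 1), ("US", 1), ("India", 1), ("LatAm", 0), ("LatAm", 0)]
rates = completion_by_region(records)
cold_spots = sorted(r for r, rate in rates.items() if rate < 0.5)
print("candidates for localization:", cold_spots)
```

Regions in `cold_spots` are exactly where the localization investment described above would go first.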
Smruti Sudarshan 25:28
And then when we went back and checked why exactly scores were dropping off, we saw that we had given them a DIY project at each of those quizzes; they basically had a coding section at the end. So even though we had localized it, and everything was as per what they needed, there was a coding section which required the learner to spend at least 15 to 20 minutes a day, and that means effort from the learners, just to stay in touch with whatever they have to do. That is where we figured out that this effort was not working. And we could also apply this to our ILT trainings, wherein they had straight eight-hour ILT Zoom sessions. So even though learners were working hard at that, and learning from the textbooks and all of that, at the end of the day the certification was not showing it, and the passing score was not being reached. This is something we were able to determine. And when we went back to the management saying, "See, this is the issue; I think we may have to reduce the effort part of it and not spend a lot of money," we did get pushback: "No, no, I think we have to keep those eight hours straight; otherwise, how do we know whether someone's actually learning or not?" That's where the entire thing of EDM came in, and we had to prove it to them through data, saying this is how it goes, and this is what we propose as an easier solution.
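The quiz-score collapse after quiz five is the kind of break point that is easy to detect once average scores are tabulated. A sketch with fabricated scores; the 0.6 threshold is an assumption, not a standard:

```python
def first_score_drop(avg_scores, threshold=0.6):
    """Return the first quiz (1-indexed) whose average score falls below
    the threshold, or None if none does."""
    for quiz_num, score in enumerate(avg_scores, start=1):
        if score < threshold:
            return quiz_num
    return None

# Average scores for eight quizzes: fine through quiz 5, then a collapse,
# mirroring the effort-versus-learning pattern described above
avg_scores = [0.78, 0.75, 0.74, 0.71, 0.70, 0.42, 0.38, 0.35]
drop_at = first_score_drop(avg_scores)
print("scores drop at quiz", drop_at)
```

Pinpointing the quiz number tells you which course element (here, the coding section) to investigate.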
Robin Sargent 27:01
I love that you can have a conversation with a business that disagrees with you about how training should be delivered, and it's not an argument. It's just what the data says. It's not me, it's not my opinion, or ego, or any of that kind of stuff; it's just, "Well, the data tells us." That's an easier conversation to have with the stakeholders and the subject matter experts, whoever is in charge of that project, by just showing data. So, alright, tell me more. So you want to think about the business objectives; you are then, of course, aligning your learning objectives with those business objectives. And then, I imagine, you're going to attach places where you grab data to see if they are meeting these learning objectives that are aligned with the business objectives. And then you're throwing it out in pilots to these learners and seeing, basically, what sticks and where the issues are. Are there some places where maybe a better needs analysis could have, you know, prevented or predicted some of the things that you find with the data?
Smruti Sudarshan 28:23
That's actually a really good point. But the thing with needs analysis is that even though we conduct the needs analysis, especially in a corporate environment, we do not have direct interaction with the customer. We are catering to teams. So if the team lead says you have to get it done in this way, then we get it done in that way. There is nothing else we are supposed to do; we cannot say this is not the right way to do it. Even if you try consulting them, saying, "You know, in my experience, this would be a more interactive approach," it's, "No, I don't think so, because I know my team, I know how they are, so they want it this way." So in that way, the data makes that navigation part a little easier, rather than sitting on the needs analysis call and giving them consultations on how we are going to do it. And in terms of needs analysis, even if you're talking to the learners: as of now, with COVID, since we have just gone online, even that activity happens through the computer. So even if a learner comes and tells you, "This is something that I don't want," you just cannot cater to one learner; you have to cater to the 10 other learners who are also taking that course and who say, "This is a problem with it, and this is the problem that needs to be fixed." So that kind of analysis helped us not just in terms of navigating the learner needs, or how corporate learners think, but also in understanding what the online learner behaviors are. We know that there are different types of learners; I think there are eight, if I'm not wrong, eight different types of learners.
But if you just come to the online scenario, those eight come down to about five broad categories of online learners. And one of the major things with online learning is that there are learners who go, "Okay, this course is not for me": they take up the first course, go through two modules, think, "Ah, this is so boring," and move on to the next one. And then they just keep hopping from one course to another course, which becomes really difficult to track at the end of the day. So there will be a certain percentage of that, and there will also be a certain percentage of learners who will really sit through the course and get through the entire quiz part of it. So what we're doing here is giving them a universal solution as to what can be avoided, rather than just deciding it during the needs analysis part and then going about the entire development. So there is a cycle that it follows, and that cycle can be reversed as well. Basically, you design; after designing, you monitor; from monitoring you get your reporting; and in the reporting section you gain insight, and then you go back to design again. So from design, you get back to the reporting part, and we say, "This is somewhere not working," and we change our strategy, or put in something extra, or delete something, so that there is a difference between nice-to-know and need-to-know content. So all of these are things that could not be captured during the needs analysis part, and definitely the data provides better insight into what exactly the course will look like at the end of the day.
Robin Sargent 31:47
I think it's so fair, Smruti, to talk about how there are many times in corporate, workplace learning, training, whatever we're going to call it, where your hands are really tied. I mean, you really are going to be the order taker; you are going to do it their way. And you can't win with arguments, right? We just know that about human nature, because once you start arguing with somebody, now their ego's on the line, they have to defend their position, and you're never going to win that way. And so, you know, just having a vehicle to avoid argument, and still get that iterative process in there, and then have real facts, where you are now showing the way instead of telling the way: I mean, besides all the wonderful things that reading this learner body language and data would give us, that alone is a pretty neat feature. So tell me more. I didn't even realize, until you started talking about it, Smruti, that there are different types of online learners, right? Like the kind that just skip around. I was even trying to think about the kind that I am; I'm definitely a skip-around type of learner. But I'm curious: you said there are five. Can you tell me a couple more, just out of my own curiosity, of what those typical patterns are?
Smruti Sudarshan 33:16
I don't know if I can show the heatmap in here, so if possible, I can share that with you: the heatmap, or the learner model for it. What we figured out from the model, if you consider red and green, so to say, is that there is also a set of learners who are visually impaired, which is something we would not be able to figure out from just the normal corporate scenario. The way it worked was that we did have this eyeball-tracking part. If you're an exploratory learner, the eyeballs would dart to different portions: the exploratory learner would go from one image to another. So you would immediately see that time frame jump from one image to another image, or from one place to another place; you would see that fast interaction across the entire screen. But if you are, say, someone who likes to go into something in depth, or understand how exactly the course works, there would be one pattern formed in one single line over there. From top to bottom, just one portion of the screen would be covered, and then later on, when you move on to the next portion of the screen, only that portion would be covered. Again, the time gap between both of them would be different, and would differ for the different people coming in. But that was such an interesting pattern to find: there's one person who just follows this one pattern over here, and there are people who go everywhere on the screen to see how exactly it's working and what it is doing. This also gave us a way to understand one more thing, known as sentiment analysis; we would grab the sentiment in there.
Say I put up a compliance course alongside a normal course, like a Java course. The compliance course might show a family, or some image like that when danger is shown, and an InfoSec course would show security imagery and so on. What happened is that those images immediately grabbed the attention; it was not the Java course that grabbed it, it was the security and InfoSec imagery, because we could literally see the gaze pattern going straight there. The eyeball tracking showed us learners fixating on that image rather than the ones beside it. So that was one way of checking which patterns and which images worked, and what could be changed in the course.
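The exploratory-versus-linear distinction Smruti describes could be sketched roughly like this. Everything here is a hypothetical illustration (the region IDs, the jump threshold, the function name), not the actual tooling discussed in the episode:

```python
# Hypothetical sketch: classify a gaze trace as "exploratory" or "linear"
# from how often fixations jump between non-adjacent screen regions.

def classify_gaze(regions, jump_threshold=0.5):
    """regions: ordered screen-region IDs (e.g. 0 = top ... 3 = bottom)
    visited by successive fixations. A 'jump' is a move of more than
    one region at a time; exploratory readers jump around a lot,
    linear readers sweep top to bottom one region at a time."""
    if len(regions) < 2:
        return "linear"
    jumps = sum(1 for a, b in zip(regions, regions[1:]) if abs(a - b) > 1)
    jump_rate = jumps / (len(regions) - 1)
    return "exploratory" if jump_rate >= jump_threshold else "linear"

print(classify_gaze([0, 3, 1, 3, 0, 2]))      # bounces all over the screen
print(classify_gaze([0, 0, 1, 1, 2, 2, 3]))   # steady top-to-bottom sweep
```

In a real eye-tracking pipeline the input would be fixation coordinates mapped onto page regions, but the same transition-rate idea applies.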
Robin Sargent 35:57
Do you ever, if we're talking about treating our learners like customers, do you ever look at A/B testing? Like, do a course one way, then change just one small thing in a second version of that pilot, just to see what the different data is? Do you do anything like that, or is that just too time consuming, or
Smruti Sudarshan 36:21
If the course is not in a rush, yes, but we do it in a post-course manner rather than on the live course. Once the course is done, we take it and do the A/B testing on it: if we use this strategy, how would it work, and if we use that strategy, how would it work, that sort of thing. But the EDM part of it comes into the picture mostly in real time. We give stakeholders the real-time data rather than the A/B testing results, because A/B testing is more time consuming, and as of now they just need the real-time data rather than seeing exactly how people are learning. But research-wise, yes, we do it post-course once the course is out. Okay,
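A minimal sketch of the post-course A/B comparison Smruti mentions might be a two-proportion z-test on completion rates. The counts below are made up for illustration, and this is standard statistics rather than anything specific to her team's process:

```python
from math import sqrt, erf

# Hypothetical sketch of post-course A/B analysis: compare the completion
# rates of two course variants with a two-proportion z-test.
def two_prop_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up numbers: variant A completed by 180 of 240 learners,
# variant B by 150 of 250.
z, p = two_prop_z(180, 240, 150, 250)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value would suggest the one design change, not chance, drove the difference in completions, which is exactly the "show, don't tell" evidence for stakeholders discussed earlier.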
Robin Sargent 37:11
And so now you've put your course up there, it's rolled out to the students, and you're starting to look at, you've said so far, heat maps and the eyeball tracking. What are some of the other things you look for? What's your checklist when you're doing your data mining?
Smruti Sudarshan 37:33
I think one is browser usage: which browser are they using? The second thing is what kind of courses they are viewing. If there are a bunch of courses up there, which ones are they actually seeing? Then definitely the dropout rates, and quiz data is something we would definitely look into, along with the eyeball tracking and the sentiment analysis. We have the comments and the shares: how many likes come in and how many dislikes. We also have micro-interactions in the course. Once you're done with the course, a star comes up through the LMS itself, so we track those micro-interactions: how many stars were collected, how many points were collected. It's not badges as such; usually it's a star. For the quiz part, we also have step-by-step tracking of how learners move through the screens. If they're jumping from question one directly to question three, what is wrong with question two? And if all students are doing that, then there's something wrong with the interaction. And if they're skipping all the questions and going directly to the final results slide, was the quiz too easy, or do we have to make it tougher? That's all part of the quizzes and the analysis.
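The question-skipping check Smruti describes could be sketched as a simple pass over per-learner navigation logs. This is a hypothetical illustration (the log format and the 50% flag threshold are assumptions), not their actual LMS tooling:

```python
from collections import Counter

# Hypothetical sketch: flag quiz questions that many learners skip.
# Each log is the ordered list of question numbers one learner visited.
def skipped_questions(logs, total_questions, flag_rate=0.5):
    skip_counts = Counter()
    for visited in logs:
        for q in range(1, total_questions + 1):
            if q not in visited:
                skip_counts[q] += 1
    n = len(logs)
    # flag any question skipped by at least flag_rate of learners
    return [q for q, c in skip_counts.items() if c / n >= flag_rate]

# Two of three learners never touch question 2 in a four-question quiz.
logs = [[1, 3, 4], [1, 3], [1, 2, 3, 4]]
print(skipped_questions(logs, total_questions=4))  # [2]
```

A question flagged this way is the cue Smruti mentions to go inspect the interaction itself: is question two broken, confusing, or simply invisible in the UI?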
At the end of the day, when we collect those reports, that's where we do the analysis and show it to our stakeholders: this is what we have analyzed, and this is how it is. But most of our real-time tracking happens with our gamified content and how learners react to it. We also have competency matching. If a skill gap analysis we create shows that certain competencies are not matching, we have a rubric plugged into the system, and that rubric calculates the competencies and tells you exactly where the employee is lacking, or what can be done to close those skill gaps.
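The rubric idea, comparing measured competencies against required levels and reporting the gaps, might look something like this in miniature. The competency names, the 0-5 scale, and the function are all hypothetical, chosen only to illustrate the concept:

```python
# Hypothetical sketch of a skill-gap rubric: compare an employee's
# competency scores against required levels and report the shortfalls.
def skill_gaps(required, actual):
    """required / actual: dicts mapping competency name -> level (0-5).
    Returns only the competencies where the employee falls short,
    with the size of each gap."""
    return {skill: required[skill] - actual.get(skill, 0)
            for skill in required
            if actual.get(skill, 0) < required[skill]}

required = {"SQL": 3, "Python": 4, "Communication": 3}
actual = {"SQL": 3, "Python": 2, "Communication": 1}
print(skill_gaps(required, actual))  # {'Python': 2, 'Communication': 2}
```

The output is exactly the kind of artifact described above: a per-employee statement of where they are lacking, which can then drive course recommendations.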
Robin Sargent 39:53
Interesting. Okay. So most of our audience are new instructional designers, or maybe they're transitioning into our field. And I love what you've been talking about, Smruti. I think it's essential, especially for innovation, for where we're going and how we can grow in our industry, to track and read our data better, and even just to think from the beginning about the data you're going to get at the end. So what are some of the things you would tell new instructional designers, or people just getting into our field, about how they can start thinking about data, and what kind of data they should really focus on as a new instructional designer? Hmm,
Smruti Sudarshan 40:39
I think, as a new instructional designer, you may want to look at the drop-off; that's a very good place to start, and it's data you can easily collect. There's also the visual design of what you're doing, since you're building a lot of elearning: where the next button is placed, where the transcript button is placed, where the settings button is placed. That kind of UI/UX data is something you can collect too, because there are times when you can't give learners everything they need, but you can definitely understand the problem: why exactly are they not taking the course, or why exactly are they not proceeding further? The other most important thing, I think, is to figure out whether localization or customization is needed. If we just go ahead and create one global English course, then even if it's good, if it doesn't suit the audiences in certain areas, with everything going online as it is now, you will see a lot of dropouts rather than people championing your course. So I think those are the things that are pretty important.
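For a new instructional designer, the drop-off analysis Smruti recommends starting with can be as simple as tallying the last slide each learner reached. The event format below is a made-up simplification; a real LMS would report this via SCORM or xAPI statements:

```python
from collections import Counter

# Hypothetical sketch: find where learners drop off, given the last
# slide each learner reached before leaving the course.
def dropoff_report(last_slides, total_slides):
    """last_slides: one entry per learner, the furthest slide reached.
    Returns (slide, count) pairs for every slide in the course."""
    counts = Counter(last_slides)
    return [(slide, counts.get(slide, 0))
            for slide in range(1, total_slides + 1)]

# Made-up data: slide 10 is the final slide, so 10 means "completed".
last_slides = [3, 3, 7, 2, 3, 10, 10]
for slide, n in dropoff_report(last_slides, 10):
    if n:
        print(f"slide {slide}: {n} learner(s) stopped here")
```

A spike like slide 3 above is the signal to go look at that screen: a confusing interaction, a misplaced next button, or content that needs localization.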
Robin Sargent 42:09
How much of a pilot would you put up there just to get data and feedback before you build the full thing? Like, when you do pilots, how much of a draft are they, how rough are they, before you put them up and start tracking data?
Smruti Sudarshan 42:26
Okay, so what we do is divide our learners. If a project comes in, we definitely divide our learners into batches, and each batch gets a different pilot. With each prototype we send out, some feature is added or some feature is eliminated depending on the batch: batch zero, one, two, three. The final product, the one that caters to everything, I think comes out of batch zero. But after batch zero we don't keep piloting; we go straight to the live course. So that's the pilot process we follow. By the time we get to batch three, the target audience we're catering to is also quite different. In batch zero we cater to a completely experienced person, say fifteen years into the job. Batch one has a slightly lower level, maybe four or five years of experience. Batch two would be someone with just two years of experience, and batch three would be a completely new person coming into the field. So that's how we do our testing to collect the
Robin Sargent 43:53
data. Oh, interesting. And then, based on that data, do you create a course that serves all the different roles, or do you still keep them separate?
Unknown Speaker 44:02
Oh, no, we build a course which caters to all of them.
Robin Sargent 44:07
So that's fascinating. Okay. All right. I just love so much of what you shared; I think you've probably sparked so much curiosity in those who have come to listen to you today, Smruti. So, what is your best and final advice for new instructional designers, before we end our podcast?
Smruti Sudarshan 44:29
I think it's this: don't listen to what you think is right. Listen to what your data is telling you and to the feedback your learners are giving, because that can be a great data point for how you design your future courses, and it adds to your experience as well. Just listen to the data rather than what you feel is right. I think that gives a better, more logical solution rather than us
Smruti Sudarshan 45:46
see that on my website as well.
Robin Sargent 45:48
And what's your website? We'll put in the show notes, but just say it.
Smruti Sudarshan 45:52
Oh, yeah, it's my portfolio site, so I can definitely leave that in the notes. Perfect.
Robin Sargent 46:00
Wonderful. We can't wait to read more from you. Thank you so much for joining me today. I really, really appreciate your time.
Unknown Speaker 46:07
Yeah, thank you. Thank you so much.
Robin Sargent 46:09
Thank you so much for listening. You can find the show notes for this episode at idolcourses.com. If you liked this podcast and you want to become an instructional designer and online learning developer, join me in the IDOL courses Academy, where you'll learn to build all the assets you need to land your first instructional design job, get early access to this podcast, tutorials for how to use the elearning authoring tools, templates for everything course building, and paid instructional design experience opportunities. Go to idolcourses.com/academy and enroll or get on the waitlist. Now get out there and build something transcendent.
Transcribed by https://otter.ai