I originally posted the comment below in response to a Wired article on the future of MOOCs:
MOOCs have their merits, including giving free access to world-class education to students who, for whatever reason (affordability, competitive ability, opportunity cost), cannot reach top-tier institutions. For students already pursuing a degree, they offer a way to explore a wider range of topics than a traditional university degree covers.
However, let me take a step back first. I believe the traditional university model is broken. Degrees are inflexible, demand too much time, cost too much, and the nature of the job market means the skills learnt have little real-world value. Why would I spend six to seven years on undergraduate and graduate study, at a cost approaching $200,000, when I could save that money or use it to start my own venture? Because jobs require "credentials" as an eligibility criterion, even though those very credentials hold little real-world application value.
As an IT student, I believe that industry-relevant courses with a project-oriented focus, internships, and other modes of self-learning are the way forward. Udacity offers something close to this, but it still needs more in-depth courses with opportunities for internships at innovative companies.
Coming back to university-style MOOCs like edX and Coursera, I think they are missing the point. The completion rate is low because a large percentage of their students are already pursuing degrees and have limited time and incentive to finish entire courses. From a learner's perspective, MOOCs are best taken in conjunction with books, internships, and projects. With so many other resources available, it is easy to see why the statistics are what they are.
I believe the way forward is for MOOC providers to clarify their goals. Are they just going to replicate traditional university courses on a larger scale, or are they going to add more value than traditional degrees by doing things differently?
Regarding the free vs. paid debate, I think offering different modes for learners with different objectives (as most MOOC providers do) is a good solution. Other aspects of MOOCs still need to be worked out, but I am still hopeful for their future.
In response to a comment that degrees teach skills that are applied every day on the job:
We learn programming 'concepts' and networking 'fundamentals' when what we should be learning is how to apply those concepts and fundamentals to build real-world projects. Concepts can be picked up from a book and practiced in a lab as part of on-the-job training or an apprenticeship.
When I apply to an IT company for a job, they ask for projects, but they also require a basic degree as an eligibility criterion. That 'basic' degree is quite expensive in the US, yet the salaries offered will be the local average back home. That very degree should include several real-world (not academically oriented) projects as part of the curriculum, but it does not.
Maybe the conventional answer would be: the real learning starts on the job, and you should do a master's degree for specialization (which requires both time and money). I hope there are better answers to this problem.
So what's the solution? A project-oriented curriculum. Compulsory internships, apprenticeships, or on-the-job training that do not require separate applications but are instead embedded in the curriculum for everyone. A flipped-classroom approach (facilitated by online courses).
Another alternative: take a job first and get training according to the job's requirements. Either cut the cost of attending university drastically or cut it out altogether and join an apprenticeship.