
New review says ineffective ‘learning styles’ theory persists in education around the world


A new review by Swansea University reveals there is widespread belief around the world in a teaching method that is not only ineffective but may actually be harmful to learners.

For decades educators have been advised to match their teaching to the supposed ‘learning styles’ of students. There are more than 70 different classification systems, but the most well-known (VARK) sees individuals being categorised as visual, auditory, read-write or kinesthetic learners.

However, a new paper by Professor Phil Newton, of Swansea University Medical School, highlights that teachers still widely believe in this ineffective approach, and calls for a more evidence-based approach to teacher training.

He explained that various reviews, carried out since the mid-2000s, have concluded there is no evidence that matching instructional methods to a student’s supposed learning style improves learning.

Professor Newton said: “This apparent widespread belief in an ineffective teaching method that is also potentially harmful has caused concern among the education community.”

His review, carried out with Swansea University student Atharva Salvi, found that a substantial majority of educators, almost 90 per cent, from samples all over the world and all types of education, reported believing in the efficacy of learning styles.

But the study points out that a learner could be at risk of being pigeonholed and consequently losing motivation as a result.

He said: “For example, a student categorized as an auditory learner may end up thinking there is no point in pursuing studies in visual subjects such as art, or written subjects like journalism, and then be demotivated during those classes.”

An additional concern is the creation of unwarranted and unrealistic expectations among educators.

Professor Newton said: “If students are not taught in a way that matches their supposed learning style, and they then do not achieve the academic grades they expect or do not enjoy their learning, they may attribute these negative experiences to a lack of matching and be further demotivated for future study.”

He added: “Spending time trying to match a student to a learning style could be a waste of valuable time and resources.”

The paper points out that there are many other teaching methods which demonstrably promote learning and are simple and easy to learn, such as use of practice tests, or the spacing of instruction, and it would be better to focus on promoting them instead.

In the paper, published in the journal Frontiers in Education, the researchers detail how they conducted a review of relevant studies to see whether the data really do suggest such confusion.

They found 89.1 per cent of 15,045 educators believed that individuals learn better when they receive information in their preferred learning style.

He said: “Perhaps the most concerning finding is that there is no evidence that this belief is decreasing.”

Professor Newton suggests history is repeating itself: “If educators are themselves screened using learning styles instruments as students, then it seems reasonable that they would enter teacher training with a view that the use of learning styles is a good thing, and so the cycle of belief would be self-perpetuating.”

The study concludes that belief in matching instruction to learning styles remains high.

He said: “There is no sign that this is declining, despite many years of work in the academic literature and popular press highlighting this lack of evidence.”

However, he also cautioned against over-reaction to the data, much of which was derived from studies where it may not be clear whether educators were asked about specific learning styles instruments, rather than about individual preferences for learning or other interpretations of the theory.

“To understand this fully, future work should focus on the objective behaviour of educators. How many of us actually match instruction to the individual learning styles of students, and what are the consequences when we do? Should we instead focus on promoting effective approaches rather than debunking myths?”


APS

Learning Styles Debunked: There is No Evidence Supporting Auditory and Visual Learning, Psychologists Say


Are you a verbal learner or a visual learner? Chances are, you’ve pegged yourself or your children as either one or the other and rely on study techniques that suit your individual learning needs. And you’re not alone— for more than 30 years, the notion that teaching methods should match a student’s particular learning style has exerted a powerful influence on education. The long-standing popularity of the learning styles movement has in turn created a thriving commercial market amongst researchers, educators, and the general public.

The wide appeal of the idea that some students will learn better when material is presented visually and that others will learn better when the material is presented verbally, or even in some other way, is evident in the vast number of learning-style tests and teaching guides available for purchase and used in schools. But does scientific research really support the existence of different learning styles, or the hypothesis that people learn better when taught in a way that matches their own unique style?

Unfortunately, the answer is no, according to a major report published in Psychological Science in the Public Interest, a journal of the Association for Psychological Science. The report, authored by a team of eminent researchers in the psychology of learning—Hal Pashler (University of California, San Diego), Mark McDaniel (Washington University in St. Louis), Doug Rohrer (University of South Florida), and Robert Bjork (University of California, Los Angeles)—reviews the existing literature on learning styles and finds that although numerous studies have purported to show the existence of different kinds of learners (such as “auditory learners” and “visual learners”), those studies have not used the type of randomized research designs that would make their findings credible.

Nearly all of the studies that purport to provide evidence for learning styles fail to satisfy key criteria for scientific validity. Any experiment designed to test the learning-styles hypothesis would need to classify learners into categories and then randomly assign the learners to use one of several different learning methods, and the participants would need to take the same test at the end of the experiment. If there is truth to the idea that learning styles and teaching styles should mesh, then learners with a given style, say visual-spatial, should learn better with instruction that meshes with that style. The authors found that of the very large number of studies claiming to support the learning-styles hypothesis, very few used this type of research design.  Of those that did, some provided evidence flatly contradictory to this meshing hypothesis, and the few findings in line with the meshing idea did not assess popular learning-style schemes.
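To make the design concrete, here is a minimal simulation sketch of the randomized “meshing” experiment described above. It is illustrative only: the style labels, group sizes, and score model are assumptions, not data from the report, and the meshing hypothesis is checked as a simple crossover interaction.

```python
# Illustrative sketch (not from the APS report): classify learners by style,
# randomly assign an instruction method within each style group, give everyone
# the same test, then estimate the crossover interaction that the meshing
# hypothesis predicts. All numbers below are made up.
import random
import statistics

random.seed(1)

STYLES = ["visual", "verbal"]    # how learners were classified beforehand
METHODS = ["visual", "verbal"]   # instruction method, randomly assigned

def simulated_score(style, method, matched_bonus=0.0):
    """Hypothetical test score; set matched_bonus > 0 to simulate a true meshing effect."""
    return random.gauss(70, 8) + (matched_bonus if style == method else 0.0)

# Randomized assignment within each style group
scores = {(s, m): [] for s in STYLES for m in METHODS}
for style in STYLES:
    for _ in range(100):                      # 100 learners per style group
        method = random.choice(METHODS)
        scores[(style, method)].append(simulated_score(style, method))

def cell_mean(cell):
    return statistics.mean(scores[cell])

# Meshing predicts a crossover: matched instruction beats mismatched instruction
# for BOTH style groups, i.e. a clearly positive interaction estimate.
interaction = (cell_mean(("visual", "visual")) - cell_mean(("visual", "verbal"))) \
            - (cell_mean(("verbal", "visual")) - cell_mean(("verbal", "verbal")))
print(f"crossover interaction estimate: {interaction:.2f} points")
```

Under the null model above the interaction hovers around zero; only a design of this shape, rather than a simple preference survey, can distinguish the two cases.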

No less than 71 different models of learning styles have been proposed over the years. Most have no doubt been created with students’ best interests in mind, and to create more suitable environments for learning. But psychological research has not found that people learn differently, at least not in the ways learning-styles proponents claim. Given the lack of scientific evidence, the authors argue that the currently widespread use of learning-style tests and teaching tools is a wasteful use of limited educational resources.


Could you please direct me to the source material for this? Thank you.


I found the study here: https://www.psychologicalscience.org/journals/pspi/PSPI_9_3.pdf


The study is here: http://www.psychologicalscience.org/journals/pspi/PSPI_9_3.pdf


I doubt a valid study could be created. There are too many variables. I expect we learn by a combination of all inputs. How could a study overcome the issues of quality of the teachers’ presentation, quality of visuals used compared to quality of auditory materials?


Larry, speaking as a statistics student, I’ll propose an answer to the issue of how a “valid study” can be designed. Feel free to call me out if there is an inherent flaw with my proposal.

I will be referring to American students specifically since this is an issue debated for the American school system. I assume the author is talking about the same thing, but I’ll admit I don’t know if this teaching idea is prevalent in other countries. For the sake of this argument, it really doesn’t matter anyways as this variable is easily changed.

The sample is the most difficult part here; I expect there to be a lot of chosen students whose parents do not wish their children to be a part of the study for some reason or another. It would also have to be conducted locally, or over a short period of time, though doing it locally would have a greater chance of acceptance among chosen participants. The greatest effort should be made to account for demographics, but, again, this would be difficult. (^Not a great way to start, apologies, but I’m sure a seasoned statistician could come up with the solution that I’m afraid I can’t)

Now, you have your grouping of students, say 1,500, a reasonable number that would provide a relatively small margin of error. Split these students into groups of 500, and assign them to 25-student-per-teacher classrooms, each taught only through auditory, visual, or “hands-on” learning. The students are specifically instructed not to take notes. For this example, let’s say they are learning the properties of liquids. The visual classes are taught through packets that each student is given. The “hands-on” class is given a sheet instructing them how to perform a lab, with blanks to fill in. Obviously, for this one, a teacher will tell them how to properly handle equipment, and said equipment will be protected against the children hurting themselves inadvertently (i.e., no Bunsen burners, but maybe a low-heat burner with students only able to turn it on/off and not touch the hot surface). The hearing group will be given a lecture on the subject, with questions allowed afterward. After a few days learning this way, every student in every class would be given the same test. Then they would all switch, this time learning about the properties of a solid through the same methods, before being tested on it. Lastly, they would switch to learning and testing on the properties of a gas. As a control, through the same selection process, 500 students could be selected to be taught using all three of the described methods in the same timeframe. That is, instead of a packet, a lecture, or a lab, they could receive a lecture while being shown a PowerPoint, followed by a lab.

To prevent bias from previous learning, I would suggest all students in the sampling population be the same age and not have received formal education previously. Also, every student should be taught to use the equipment before the experiment so that the “hands-on” group wouldn’t be at an initial disadvantage.

I’m not a teacher, a psychologist, or a professional statistician. This is just my proposal using my current knowledge of statistics. Take it with a grain of salt and form your own opinions; this is simply being put forth to show that such an experiment seems viable given the proper infrastructure and coordination.


Of course, your method makes sense, but it borders on unethical because it is wrong to teach a child anything in a way that they will not understand. I’m not saying you are unethical, but that any scheme that teaches inappropriately (“don’t take notes”) for more than 5 minutes is unethical.


What a bunch of arrogant people to think that they know whether there exists one learning style…!? The only learning style we know is the one in our head. How can you say that there are no other creative ways of learning? What about Autistic people? What about Blind people? What about Deaf people? And Bipolar people? And what about Dyslexic people? And people who have damage to the parts of the brain that handle verbal speech comprehension???? Why give so much importance to a little psychology paper? Anybody can do a 3-year psychology degree and then write a paper claiming blah blah blah


That’s not what they’re saying at all. They’re saying that there are no categories, or boxes, that people can be put in based on their learning style. They’re not saying there is just one way to learn. No need to get so worked up. People with damage to specific parts of their brains or sensory organs are obviously the outlier. Obviously they are going to be radically different.

And publishing a paper in an esteemed journal takes a _little_ bit more than a 3-year BSc in psychology. It’s that comment that really reveals the depth of your ignorance.


As someone diagnosed with high-functioning autism and currently in a concurrent education course, I think it is much more dangerous to tell someone they should be okay with only learning in one way than to teach them to be flexible and able to absorb information from all sorts of mediums. So I’m gonna assume you’re blind, dyslexic, and autistic, because you’ve assumed you can speak for all of them, yes? Your example of someone being blind also helps to further disprove learning styles theory — which implies nature over nurture — because clearly the ‘visual’ learners who are rendered blind must learn to learn in a different way (which statistically is shown to affect their learning no differently).


…SOMEBODY doesn’t at all understand the scientific method, reasoning or science in general.


I hope that we can finally move past these always dubious “sensory” learning styles. They’re really “modes,” different ways of learning. I’ve long argued that anyone who feels weak at using any of them needs to practice using that mode more, not less. But another old branch of learning styles based on differing neurotransmitter biases seemed to have better prospects, even if I’ve seen little done with it for decades now. I hope we don’t toss out the entire learning style baby with the dirty “sensory style” bathwater. With our updated technology, we could probably go much farther with it. For background, see dated and rather poorly written but better reasoned explanatory work by Jane Gear.


“I’ve long argued that anyone who feels weak at using any of them needs to practice using that mode more, not less.” As a kid already struggling through school with learning disabilities and the resulting long-term stress and exhaustion, the last thing I needed was to make things more difficult.


Allow me to state categorically that there are learning styles specific to learners. To get to the issue at hand, the methods proposed by these researchers as a way to invalidate learning proclivities as a concept are not only inapposite, but also akin to saying that every learner approaches the universe of learning in exactly the same way. If that is the measure of what we are to agree on as constituting scientific efficacy on any issue, then all forms of research are suspect in their nature, methods, outcomes, and overall usefulness.

Such a view of research is clearly misguided, ill-informed and half-scientific … even from a commonsense perspective. It serves no social or scientific utility, except the interest of the investigators.

Mind you, we are not referring to the efficacy of styles presumed or shown to improve grades; rather, we argue that there are humane, less torturous, more comfortable, less arduous and even naturalistic ways of teaching students by emphasizing their uniquely preferred styles, wherever determinable.

Even where indeterminable, instructors should be encouraged to vary their teaching methods to accommodate the learning needs of their captive audience, in this case their students, and especially not to assume that students learn in essentially the same way as, for example, their instructors.

To think that all learners learn the same way, whether in styles or approaches, and to suppose that instruction is a “straitjacket” that should fit all “body sizes”, is in itself a form of miseducation, misrepresentation and/or a kind of stiff recalcitrance that should never take hold in the mind of an educator, much less a group of psychologists.

Cronbach and Snow’s work on learning differences [in the 60s], along with findings from trait/factor analysis, are some of the few materials that may well serve as enviable pivots for the current exchange.


When it comes to research concerning learning styles…the human dynamics of learning is so complex that attempting to isolate independent variables that may affect learning is like trying to determine the direction of an automobile by studying petroleum chemistry.


The big problem of understanding this is that people don’t focus on the clear and precise language being used, and don’t understand how experimental science works.

What is being said is that “learning styles” theories which denote specific “auditory” and “visual” learning styles do not have any scientific evidence for them. Those who are evaluated to be predominantly “auditory” in terms of a “learning style” do not in fact perform better or differently when taught “visually” and vice versa.

This is important, because while it seems intuitively true that some people might learn better with a specific medium, there is no evidence for it. What there is evidence for is the superiority of multi-modal or multi-media instruction, in terms of learning outcomes.

The main point is don’t waste time on something that has no evidence to support it. See a ranking of effect size on educational reforms to see what is most important, and what is least: https://visible-learning.org/hattie-ranking-influences-effect-sizes-learning-achievement/


I am currently studying to be an ESL teacher and have come across these “learning styles” within the course. I do have a rather concerning view about them.

I can see that many minds are put behind how we are going to teach and get the “message” across to learners, but sadly I feel like there is an overdose of ego about “who has the better way to teach”. I know that’s a pretty heavy assumption, but I can’t conclude much else, except maybe there is a fear that the future generation may not learn correctly; if this is the case, it manifests in over-thinking techniques and dividing the ways individuals learn. I do, however, believe that segregating the ways in which people learn is crazy and an over-analysed attempt.

As I was studying this I couldn’t help but scrunch my nose in confusion when a lot of the individual “learning styles” were something that I have as a “whole” and as an “individual”. I strongly believe that everything works hand in hand.

If I was to simply hold up a picture of someone playing golf and not attach a word or action to it, students would simply know what it looks like but not know what to call it or how it works. Auditory and kinaesthetic would be eliminated and the student would be deprived. But what concerns me is that I would be compelled to put action to something like this (in a teacher’s mind) and tell them what we call it (golf). So to segregate “learning styles” you must be going against a law within your conscience as to how we ALL “learn”. This seriously is a no-brainer for me.

I must say, though, not everything is based on science; simply using your brain can solve many complications. I say that encouragingly, not as a rivalry. Hope this was helpful.

Teaching golf would integrate all 4 learning styles. Why not use kinesthetic methods to complement visual, auditory, and logical when appropriate?


Howdy folks! I’d like to understand this a little better. So these guys invalidated all of the studies because they didn’t meet their standard, and for that reason they declare that everyone has the same learning style? Is that what they are saying? I don’t see that they set up and carried out a scientific study that meets their own criteria to prove their hypothesis that we all have the same learning style. Did I miss something there? Let’s just say the science wasn’t good enough, as they say; then that only means that the science hasn’t proved anything. If the science isn’t good enough to prove it right… then I’m thinking it doesn’t prove it wrong either. Wouldn’t that just mean the hypothesis remains unproven? I wonder too if someone can explain what learning style I’m using when I’m learning how to play my drums. So I’m trying to learn a double stroke roll and feeling the stick bounce and snapping my fingers and wrist at the right moment… it’s all about the feel. To me that’s my kinesthetic learning channel. Programming my “muscle memory” is yet another frame for explaining it. Does their conclusion invalidate this learning channel? When it comes to learning songs, I listen by sound. I listen and repeat. I have friends who can only play along with sheet music. They read and play. I didn’t carry out a study to figure this out. I just talk to other drummers, and there are clearly two sets of learning styles right there. Many drummers can only sight-read. I can’t. I ask then… how is this possible if we all have the same learning style? And the argument is that we should stop wasting money trying to make education better? Really? I think I’ll disengage my gullible learning style and turn on my critical thinking style. …or does that not exist either?


You have to learn to read sheet music, just like reading a book, but it takes effort, and once you learn it, it is useful. Learning by ear is more natural, and maybe you will be more creative because music is audio. The Beatles could not read music. They seem to be saying it has not been determined whether audio or visual learning styles exist, not whether one is better than the other; and if we don’t know whether they exist, then why spend money on them? You could invent many other plausible teaching methods and theories and spend a lot of money, but maybe the best money is spent on things we know make a difference.


I don’t know what most people in here are even talking about. Scientific research? In the end it comes down to enjoyment.


Individuals are diverse from one another both in appearance and behaviour. It has not been proven that learning styles are debunked, only that on review by some eminent scientists, a shadow of doubt challenges the premise. Thus, if we are diverse creatures, it follows that we will take in the world in diverse ways: some of us will have more developed auditory faculties, and some hardwiring may mean visuals are easier – this is not a study but a fact. We as humans do everything differently from others; perhaps universal categories should not be bandied about carelessly. But in education in particular, we certainly do take our world in in many and varied forms, construct how we see it and enact a life we see fit, all embedded in our social environment.


I was encouraged that the psychologists got put in their place by those of us (teachers) who understand that all children learn differently. Why would you want to frustrate any child with visual learning material that leads to nothing but failure, when the same child can find success with teaching methods that match the child’s learning style?

Byron Thorne, author of Toward A Failure-Proof Methodology for Learning To Read.


I love the article and the follow-up debate. As a long-time educator and student of education, it is positive to see all the different perspectives. How to make things manageable for learners? Multi-modal presentations, with options for showing one’s understanding and learning. Any teaching can be presented in various ways concurrently, as long as we give the students what they need to have access to it. My question would be more about how best to engage students so that they become engaged and self-motivated. Love the conversation.


I don’t agree or believe you! I am a visual and auditory learner. It works for me! I was a teacher, and everyone has preferred learning styles. Some people do better with a snack. Some are tactile. Your study may be flawed, but your conclusions are wrong. – Nance


They can say what they like, but I have seen very different foci in various individuals, with the same-system adherents failing miserably, more often than not, relative to more flexible instructors. I myself cannot grasp complex ideas without first having, or mentally generating, a visual reference.



May 29, 2018

The Problem with "Learning Styles"

There is little scientific support for this fashionable idea—and stronger evidence for other learning strategies

By Cindi May


When it comes to home projects, I am a step-by-step kind of girl. I read the instructions from start to finish, and then reread and execute each step. My husband, on the other hand, prefers to study the diagrams and then jump right in. Think owner’s manual versus IKEA instructions. This preference for one approach over another when learning new information is not uncommon. Indeed the notion that people learn in different ways is such a pervasive belief in American culture that there is a thriving industry dedicated to identifying learning styles and training teachers to meet the needs of different learners.

Just because a notion is popular, however, doesn’t make it true. A recent review of the scientific literature on learning styles found scant evidence to clearly support the idea that outcomes are best when instructional techniques align with individuals’ learning styles. In fact, there are several studies that contradict this belief. It is clear that people have a strong sense of their own learning preferences (e.g., visual, kinesthetic, intuitive), but it is less clear that these preferences matter.

Research by Polly Husmann and Valerie Dean O’Loughlin at Indiana University takes a new look at this important question. Most previous investigations of learning styles focused on classroom learning and assessed whether instructional style affected outcomes for different types of learners. But is the classroom really where most of the serious learning occurs? Some might argue that, in this era of flipped classrooms and online course materials, students master more of the information on their own. That might explain why instructional style in the classroom matters little. It also raises the possibility that learning styles do matter—perhaps a match between students’ individual learning styles and their study strategies is the key to optimal outcomes.


To explore this possibility, Husmann and O’Loughlin asked students enrolled in an anatomy class to complete an online learning styles assessment and answer questions about their study strategies. More than 400 students completed the VARK (visual, auditory, reading/writing, kinesthetic) learning styles evaluation and reported details about the techniques they used for mastering material outside of class (e.g., flash cards, review of lecture notes, anatomy coloring books). Researchers also tracked their performance in both the lecture and lab components of the course.

Scores on the VARK suggested that most students used multiple learning styles (e.g., visual + kinesthetic or reading/writing + visual + auditory), but that no particular style (or combination of styles) resulted in better outcomes than another. The focus in this study, however, was not on whether a particular learning style was more advantageous. Instead, the research addressed two primary questions: First, do students who take the VARK questionnaire to identify their personal learning style adopt study strategies that align with that style? Second, are the learning outcomes better for students whose strategies match their VARK profile than for students whose strategies do not?

Despite knowing their own self-reported learning preferences, nearly 70% of students failed to employ study techniques that supported those preferences. Most visual learners did not rely heavily on visual strategies (e.g., diagrams, graphics), nor did most reading/writing learners rely predominantly on reading strategies (e.g., review of notes or textbook), and so on. Given the prevailing belief that learning styles matter, and the fact that many students blame poor academic performance on a mismatch between their learning style and teachers’ instructional methods, one might expect students to rely on techniques that support their personal learning preferences when working on their own.

Perhaps the best students do. Nearly a third of the students in the study did choose strategies that were consistent with their reported learning style. Did that pay off? In a word, no. Students whose study strategies aligned with their VARK scores performed no better in either the lecture or lab component of the course.
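To make the two questions concrete, the sketch below shows one way the alignment between a student’s VARK profile and their reported study strategies could be checked, and how aligned and unaligned groups could then be compared. The field names, the strategy-to-modality mapping, the alignment rule, and the example records are all assumptions for illustration; none of it is taken from Husmann and O’Loughlin’s paper.

```python
# Hedged sketch: flag whether each (hypothetical) student's study strategies
# exercise any of their dominant VARK modalities, then compare course grades
# for aligned vs. unaligned students.
from statistics import mean

# Hypothetical records: dominant VARK style(s), reported strategies, course grade.
students = [
    {"vark": {"visual"}, "strategies": {"flashcards", "lecture_notes"}, "grade": 78},
    {"vark": {"kinesthetic", "visual"}, "strategies": {"diagrams"}, "grade": 84},
    {"vark": {"read_write"}, "strategies": {"textbook", "lecture_notes"}, "grade": 71},
]

# Assumed mapping from a study strategy to the modality it mainly exercises.
STRATEGY_MODALITY = {
    "diagrams": "visual", "coloring_book": "visual",
    "lecture_notes": "read_write", "textbook": "read_write",
    "flashcards": "read_write", "podcasts": "aural", "lab_practice": "kinesthetic",
}

def is_aligned(student):
    """Aligned if any reported strategy exercises one of the student's dominant styles."""
    used = {STRATEGY_MODALITY.get(s) for s in student["strategies"]}
    return bool(used & student["vark"])

aligned = [s["grade"] for s in students if is_aligned(s)]
unaligned = [s["grade"] for s in students if not is_aligned(s)]
print("aligned mean grade:", mean(aligned) if aligned else None)
print("unaligned mean grade:", mean(unaligned) if unaligned else None)
```

In the study itself, the comparison of the two groups showed no advantage for alignment, which is the point of the paragraph above.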

So most students are not employing study strategies that mesh with self-reported learning preferences, and the minority who do show no academic benefit. Although students believe that learning preferences influence performance, this research affirms the mounting evidence that they do not, even when students are mastering information on their own. These findings suggest a general lack of student awareness about the processes and behaviors that support effective learning. Consistent with this notion, Husmann and O’Loughlin also found negative correlations between many of the common study strategies reported by students (e.g., making flashcards, use of outside websites) and course performance. Thus, regardless of individual learning style or the alignment of the style with study techniques, many students are adopting strategies that simply do not support comprehension and retention of information.

Fortunately, cognitive science has identified a number of methods to enhance knowledge acquisition, and these techniques have fairly universal benefit. Students are more successful when they space out their study sessions over time, experience the material in multiple modalities, test themselves on the material as part of their study practices, and elaborate on material to make meaningful connections rather than engaging in activities that involve simple repetition of information (e.g., making flashcards or recopying notes). These effective strategies were identified decades ago and have convincing and significant empirical support. Why, then, do we persist in our belief that learning styles matter and ignore these tried and true techniques?
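As one illustration of the first of those strategies, the sketch below builds a simple expanding-interval review schedule. The starting gap and multiplier are assumptions chosen for illustration, not recommendations from the article; the point is only that study sessions are spread out over time rather than massed into a single cramming session.

```python
# Illustrative sketch: an expanding-interval schedule for spaced practice.
# The starting gap and multiplier below are assumptions, not prescribed values.
from datetime import date, timedelta

def spaced_schedule(first_study, reviews=4, first_gap_days=1, multiplier=2.0):
    """Return review dates whose gaps roughly double each time."""
    schedule, gap, current = [], first_gap_days, first_study
    for _ in range(reviews):
        current = current + timedelta(days=round(gap))
        schedule.append(current)
        gap *= multiplier
    return schedule

if __name__ == "__main__":
    for review_date in spaced_schedule(date(2018, 5, 29)):
        print(review_date.isoformat())   # reviews on days +1, +3, +7, +15
```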

The popularity of the learning styles mythology may stem in part from the appeal of finding out what “type of person” you are, along with the desire to be treated as an individual within the education system. In contrast, the notion that universal strategies may enhance learning for all belies the idea that we are unique, individual learners. In addition, most empirically-supported techniques involve planning (e.g., scheduling study sessions over a series of days) and significant effort (e.g., taking practice tests in advance of a classroom assessment), and let’s face it, we don’t want to work that hard.

Cindi May is a professor of psychology at the College of Charleston. She explores avenues for improving cognitive function and outcomes in college students, older adults and individuals who are neurodiverse.

Center for Teaching

Learning Styles: What are learning styles, and why are they so popular?

The term learning styles is widely used to describe how learners gather, sift through, interpret, organize, come to conclusions about, and “store” information for further use.  As spelled out in VARK (one of the most popular learning styles inventories), these styles are often categorized by sensory approaches: visual, aural, verbal [reading/writing], and kinesthetic.  Many of the models that don’t resemble the VARK’s sensory focus are reminiscent of Felder and Silverman’s Index of Learning Styles, with a continuum of descriptors for how learners process and organize information: active-reflective, sensing-intuitive, verbal-visual, and sequential-global.

There are well over 70 different learning styles schemes (Coffield, 2004), most of which are supported by “a thriving industry devoted to publishing learning-styles tests and guidebooks” and “professional development workshops for teachers and educators” (Pashler, et al., 2009, p. 105).

Despite the variation in categories, the fundamental idea behind learning styles is the same: that each of us has a specific learning style (sometimes called a “preference”), and we learn best when information is presented to us in this style.  For example, visual learners would learn any subject matter best if given graphically or through other kinds of visual images, kinesthetic learners would learn more effectively if they could involve bodily movements in the learning process, and so on.  The message thus given to instructors is that “optimal instruction requires diagnosing individuals’ learning style[s] and tailoring instruction accordingly” (Pashler, et al., 2009, p. 105).

Despite the popularity of learning styles and inventories such as the VARK, it’s important to know that there is no evidence to support the idea that matching activities to one’s learning style improves learning.  It’s not simply a matter of “absence of evidence is not evidence of absence.”  On the contrary, for years researchers have tried to make this connection through hundreds of studies.

In 2009, Psychological Science in the Public Interest commissioned cognitive psychologists Harold Pashler, Mark McDaniel, Doug Rohrer, and Robert Bjork to evaluate the research on learning styles to determine whether there is credible evidence to support using learning styles in instruction.  They came to a startling but clear conclusion:  “Although the literature on learning styles is enormous,” they “found virtually no evidence” supporting the idea that “instruction is best provided in a format that matches the preference of the learner.”  Many of those studies suffered from weak research design, rendering them far from convincing.  Others with an effective experimental design “found results that flatly contradict the popular” assumptions about learning styles (p. 105). In sum,

“The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing” (p. 117).

Pashler and his colleagues point to some reasons to explain why learning styles have gained—and kept—such traction, aside from the enormous industry that supports the concept.  First, people like to identify themselves and others by “type.” Such categories help order the social environment and offer quick ways of understanding each other.  Also, this approach appeals to the idea that learners should be recognized as “unique individuals”—or, more precisely, that differences among students should be acknowledged—rather than treated as a number in a crowd or a faceless class of students (p. 107). Carried further, teaching to different learning styles suggests that “all people have the potential to learn effectively and easily if only instruction is tailored to their individual learning styles” (p. 107).

There may be another reason why this approach to learning styles is so widely accepted. Learning styles very loosely resemble the concept of metacognition, or the process of thinking about one’s thinking.  For instance, having your students describe which study strategies and conditions worked for them on their last exam and which didn’t is likely to improve their studying for the next exam (Tanner, 2012).  Integrating such metacognitive activities into the classroom—unlike learning styles—is supported by a wealth of research (e.g., Askell-Williams, Lawson, & Murray-Harvey, 2007; Bransford, Brown, & Cocking, 2000; Butler & Winne, 1995; Isaacson & Fujita, 2006; Nelson & Dunlosky, 1991; Tobias & Everson, 2002).

Importantly, metacognition is focused on planning, monitoring, and evaluating any kind of thinking about thinking, and does nothing to connect one’s identity or abilities to any singular approach to knowledge.  (For more information about metacognition, see CFT Assistant Director Cynthia Brame’s “Thinking about Metacognition” blog post, and stay tuned for a Teaching Guide on metacognition this spring.)

There is, however, something you can take away from these different approaches to learning—not based on the learner, but instead on the content being learned .  To explore the persistence of the belief in learning styles, CFT Assistant Director Nancy Chick interviewed Dr. Bill Cerbin, Professor of Psychology and Director of the Center for Advancing Teaching and Learning at the University of Wisconsin-La Crosse and former Carnegie Scholar with the Carnegie Academy for the Scholarship of Teaching and Learning.  He points out that the differences identified by the labels “visual, auditory, kinesthetic, and reading/writing” are more appropriately connected to the nature of the discipline:

“There may be evidence that indicates that there are some ways to teach some subjects that are just better than others, despite the learning styles of individuals…. If you’re thinking about teaching sculpture, I’m not sure that long tracts of verbal descriptions of statues or of sculptures would be a particularly effective way for individuals to learn about works of art. Naturally, these are physical objects and you need to take a look at them, you might even need to handle them.” (Cerbin, 2011, 7:45-8:30)

Pashler and his colleagues agree: “An obvious point is that the optimal instructional method is likely to vary across disciplines” (p. 116). In other words, it makes disciplinary sense to include kinesthetic activities in sculpture and anatomy courses, reading/writing activities in literature and history courses, visual activities in geography and engineering courses, and auditory activities in music, foreign language, and speech courses.  Obvious or not, it aligns teaching and learning with the contours of the subject matter, without limiting the potential abilities of the learners.

  • Askell-Williams, H., Lawson, M., & Murray-Harvey, R. (2007). ‘What happens in my university classes that helps me to learn?’: Teacher education students’ instructional metacognitive knowledge. International Journal of the Scholarship of Teaching and Learning, 1, 1-21.
  • Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school (Expanded edition). Washington, D.C.: National Academy Press.
  • Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65, 245-281.
  • Cerbin, W. (2011). Understanding learning styles: A conversation with Dr. Bill Cerbin. Interview with Nancy Chick. UW Colleges Virtual Teaching and Learning Center.
  • Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and pedagogy in post-16 learning: A systematic and critical review. London: Learning and Skills Research Centre.
  • Isaacson, R. M., & Fujita, F. (2006). Metacognitive knowledge monitoring and self-regulated learning: Academic success and reflections on learning. Journal of the Scholarship of Teaching and Learning, 6, 39-55.
  • Nelson, T. O., & Dunlosky, J. (1991). The delayed-JOL effect: When delaying your judgments of learning can improve the accuracy of your metacognitive monitoring. Psychological Science, 2, 267-270.
  • Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.
  • Tobias, S., & Everson, H. (2002). Knowing what you know and what you don’t: Further research on metacognitive knowledge monitoring. College Board Report No. 2002-3. College Board, NY.

Research article | Open access | Published: 01 October 2021

Adaptive e-learning environment based on learning styles and its impact on development students' engagement

  • Hassan A. El-Sabagh (ORCID: orcid.org/0000-0001-5463-5982)

International Journal of Educational Technology in Higher Education, volume 18, Article number: 53 (2021)


Abstract

Adaptive e-learning is viewed as a stimulus to support learning and improve student engagement, so designing appropriate adaptive e-learning environments contributes to personalizing instruction and reinforcing learning outcomes. The purpose of this paper is to design an adaptive e-learning environment based on students’ learning styles and to study the impact of that environment on students’ engagement. The research also outlines and compares the proposed adaptive e-learning environment with a conventional e-learning approach. The paper is based on mixed research methods: a development method was used to design the adaptive e-learning environment, and a quasi-experimental research design was used for the experiment. A student engagement scale was used to measure affective and behavioral factors of engagement (skills, participation/interaction, performance, and emotion). The results revealed that engagement in the experimental group was statistically significantly higher than in the control group. These experimental results imply the potential of an adaptive e-learning environment to engage students in learning. Several practical recommendations follow from this paper: how to design a base for adaptive e-learning built on learning styles and how to implement it; how to increase the impact of adaptive e-learning in education; and how to raise the cost efficiency of education. The proposed adaptive e-learning approach and the results can help e-learning institutes design and develop more customized and adaptive e-learning environments that reinforce student engagement.

Introduction

In recent years, educational technology has advanced at a rapid rate. Once learning experiences are customized, e-learning content becomes richer and more diverse (El-Sabagh & Hamed, 2020; Yang et al., 2013). E-learning produces constructive learning outcomes, as it allows students to participate actively in learning at any time and any place (Chen et al., 2010; Lee et al., 2019). Recently, adaptive e-learning has become an approach that is widely implemented by higher education institutions. The adaptive e-learning environment (ALE) is an emerging research field concerned with fulfilling students’ learning styles by adapting the learning environment within the learning management system (LMS), changing how e-content is delivered. Adaptive e-learning is a learning process in which the content is taught or adapted based on the students’ learning styles or preferences (Normadhi et al., 2019; Oxman & Wong, 2014). By offering customized content, adaptive e-learning environments improve the quality of online learning. The customized environment should adapt to the needs and learning styles of each student in the same course (Franzoni & Assar, 2009; Kolekar et al., 2017). Adaptive e-learning changes the level of instruction dynamically based on student learning styles and personalizes instruction to enhance or accelerate a student’s success. Directing instruction to each student’s strengths and content needs can minimize course dropout rates and increase student outcomes and the speed at which they are accomplished. The personalized learning approach focuses on providing an effective, customized, and efficient path of learning so that every student can participate in the learning process (Hussein & Al-Chalabi, 2020). Learning styles, on the other hand, represent an important issue in twenty-first-century learning, with students expected to participate actively in developing self-understanding as well as engagement with their environment (Klasnja-Milicevic et al., 2011; Nuankaew et al., 2019; Truong, 2016).

In current conventional e-learning environments, instruction has traditionally followed a “one style fits all” approach, meaning that all students are exposed to the same learning procedures. This type of learning does not take into account students’ different learning styles and preferences. Currently, the development of e-learning systems has accommodated and supported personalized learning, in which instruction is fitted to a student’s individual needs and learning styles (Beldagli & Adiguzel, 2010; Benhamdi et al., 2017; Pashler et al., 2008). Some personalized approaches let students choose content that matches their personality (Hussein & Al-Chalabi, 2020). The delivery of course materials is an important issue in personalized learning. Moreover, designing a well-constructed, effective, adaptive e-learning system is a challenge because of the complexity of adapting to the different needs of learners (Alshammari, 2016). Even so, it is claimed that shifting to adaptive e-learning environments can reinforce students’ engagement. However, a learning environment cannot be considered adaptive if it is not flexible enough to accommodate students’ learning styles (Ennouamani & Mahani, 2017).

On the other hand, while student engagement has become a central issue in learning, it is also an indicator of educational quality and of whether active learning occurs in classes (Lee et al., 2019; Nkomo et al., 2021; Robinson & Hullinger, 2008). Veiga et al. (2014) suggest that there is a need for further research on engagement because students’ engagement is a predictor of learning and academic progress. It is important to clarify the distinction between causal factors such as the learning environment and outcome factors such as achievement. Accordingly, student engagement is an important research topic because it affects a student’s final grade and course dropout rate (Staikopoulos et al., 2015).

The Umm Al-Qura University strategic plan, through the common first-year deanship, has focused on best practices that increase students’ higher-order skills. These skills include communication, problem-solving, research, and creative thinking skills. Although the UQU action plan involves improving these skills through common first-year academic programs, students’ learning skills need to be encouraged and engaged more (Umm Al-Qura University Agency, 2020). Based on the author’s experience, the conventional methods of instruction in the “learning skills” course present the content to all students in one style, relying on comprehension of the content regardless of the diversity of their learning styles.

According to some studies (Alshammari & Qtaish, 2019; Lee & Kim, 2012; Shih et al., 2008; Verdú et al., 2008; Yalcinalp & Avc, 2019), little attention is paid to the needs and preferences of individual learners, and as a result all learners are treated in the same way. More research into the impact of educational technologies on developing skills and performance among different learners is recommended. This “one-style-fits-all” approach implies that all learners are expected to use the same learning style prescribed by the e-learning environment. A review of the literature subsequently suggested that an adaptive e-learning environment could affect learning outcomes and fill the identified gap. In conclusion, adaptive e-learning environments rely on the learner’s preferences and learning style as the reference that supports adaptation.

To confirm the above, the author conducted an exploratory study via an open interview, with some questions, with a sample of 50 students in the learning skills department of the common first year. The questions asked about the difficulties they face when studying the “learning skills” course and about their preferred way of receiving course content. Most students (88%) agreed that the way content is presented does not differ according to their individual differences, and that they suffer from a lack of personalized learning compatible with their style of work. Students (82%) agreed that they lack adaptive educational content that helps them engage in the learning process. Accordingly, the author framed the research problem.

This research adds to the existing body of knowledge on the subject. It is considered significant because it improves understanding of the challenges involved in designing adaptive environments based on the learning styles parameter. The rest of this paper is structured as follows: the next section presents the related work cited in the literature, followed by the research methodology, then data collection, results, and discussion; finally, some conclusions and future trends are discussed.

Theoretical framework

This section provides a brief review of the literature on adaptive e-learning environments based on learning styles.

Adaptive e-learning environments based on learning styles

The adoption of adaptive e-learning in higher education has been slow to evolve, and the challenges that led to the slow implementation still exist. The learning management system offers the same tools to all learners, although individual learners need different material based on their learning styles and preferences (Beldagli & Adiguzel, 2010; Kolekar et al., 2017). An adaptive e-learning environment requires evaluating the learner’s preferred learning style, either before course delivery (for example, with an online quiz) or during course delivery (for example, by tracking student reactions) (DeCapua & Marshall, 2015).

In e-learning environments, adaptation is constructed on a series of well-designed processes to fit the instructional materials. The adaptive e-learning framework attempts to match instructional content to learners’ needs and styles. According to Qazdar et al. (2015), adaptive e-learning (AEL) environments rely on constructing a model of each learner’s needs, preferences, and styles. It is well recognized that such adaptive behavior can increase learners’ development and performance, thus enriching the quality of the learning experience (Shi et al., 2013). The following features of adaptive e-learning environments can be identified: diversity, interactivity, adaptability, feedback, performance, and predictability. Although adaptive framework taxonomies and characteristics relate to various elements, adaptive learning includes at least three: a model of the structure of the content to be learned, with detailed learning outcomes (a content model); a representation of the student’s expertise based on performance, together with a method of interpreting student strengths (a learner model); and a method of matching the instructional materials to the learner and delivering them in a customized way (an instructional model) (Ali et al., 2019). The number of adaptive e-learning studies has increased over the last few years, and adaptive e-learning is likely to grow at an accelerating pace at all levels of instruction (Hussein & Al-Chalabi, 2020; Oxman & Wong, 2014).
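To make those three elements more tangible, here is a minimal sketch, under assumptions, of how a content model, a learner model, and an instructional model could fit together. The class names, fields, and mastery threshold are illustrative inventions, not structures taken from the paper.

```python
# Hedged sketch of the three adaptive-learning elements named above.
from dataclasses import dataclass, field

@dataclass
class ContentModel:
    """Structure of the content to be learned, keyed by learning outcome."""
    # outcome -> {style: learning-object reference}, e.g. a video vs. a handout
    variants: dict = field(default_factory=dict)

@dataclass
class LearnerModel:
    """What we know about the learner: dominant style and mastery per outcome."""
    dominant_style: str = "visual"
    mastery: dict = field(default_factory=dict)   # outcome -> score 0..1

class InstructionalModel:
    """Chooses which variant of the next unmastered outcome to deliver."""
    def __init__(self, content):
        self.content = content

    def next_activity(self, learner, threshold=0.8):
        for outcome, by_style in self.content.variants.items():
            if learner.mastery.get(outcome, 0.0) < threshold:
                # Fall back to any available variant if the preferred one is missing.
                return outcome, by_style.get(learner.dominant_style,
                                             next(iter(by_style.values())))
        return None  # all outcomes mastered

# Example usage with hypothetical course data
content = ContentModel(variants={
    "note_taking": {"visual": "infographic_01", "aural": "podcast_01"},
    "time_management": {"read_write": "handout_02", "kinesthetic": "workshop_02"},
})
learner = LearnerModel(dominant_style="aural", mastery={"note_taking": 0.9})
print(InstructionalModel(content).next_activity(learner))
# -> ('time_management', 'handout_02')  (falls back: no aural variant available)
```

In a real LMS integration, the learner model would presumably be updated from quiz results and interaction logs rather than hard-coded as it is here.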

Many studies have confirmed the power of adaptive e-learning to deliver e-content in a way that fits learners’ needs and learning styles, which helps improve students’ acquisition of knowledge and experience and develops their higher-order thinking skills (Ali et al., 2019; Behaz & Djoudi, 2012; Chun-Hui et al., 2017; Daines et al., 2016; Dominic et al., 2015; Mahnane et al., 2013; Vassileva, 2012). Students’ learning-style characteristics are recognized as an important issue and a vital influence on learning, and are frequently used as a foundation for generating personalized learning experiences (Alshammari & Qtaish, 2019; El-Sabagh & Hamed, 2020; Hussein & Al-Chalabi, 2020; Klasnja-Milicevic et al., 2011; Normadhi et al., 2019; Ozyurt & Ozyurt, 2015).

The learning style is a parameter for designing adaptive e-learning environments. Individuals differ in their learning styles when interacting with the content presented to them, and many studies have emphasized the relationship between e-learning and learning styles as a way to motivate learners and consequently improve learning outcomes (Ali et al., 2019; Alshammari, 2016; Alzain et al., 2018a, b; Liang, 2012; Mahnane et al., 2013; Nainie et al., 2010; Velázquez & Assar, 2009). The term “learning style” refers to the process by which the learner organizes, processes, represents, and combines information and stores it in their cognitive store, then retrieves that information and experience in a style that reflects their way of communicating it (Fleming & Baume, 2006; Jaleel & Thomas, 2019; Jonassen & Grabowski, 2012; Klasnja-Milicevic et al., 2011; Nuankaew et al., 2019; Pashler et al., 2008; Willingham et al., 2015; Zhang, 2017). The concept of learning style is founded on the observation that students vary in how they receive knowledge and think, which helps them recognize and combine information in their minds as well as acquire experience and skills (Naqeeb, 2011). The extensive scholarly literature on learning styles contains few strong experimental findings (Truong, 2016), and few findings on the effect of adapting instruction to learning style. There are many models of learning styles (Aldosarim et al., 2018; Alzain et al., 2018a, 2018b; Cletus & Eneluwe, 2020; Franzoni & Assar, 2009; Willingham et al., 2015), including the VARK model, one of the most well-known models used to classify learning styles. The VARK questionnaire offers insight into information-processing preferences (Johnson, 2009). Fleming and Baume (2006) developed the VARK model, which consists of four preferred learning types: “V” stands for the visual style, “A” for the aural (auditory) style, “R/W” for the reading/writing style, and “K” for the kinesthetic (practical) style. Moreover, VARK distinguishes the visual category further into graphical and textual, or visual and read/write, learners (Murphy et al., 2004; Leung et al., 2014; Willingham et al., 2015). The four categories of the VARK Learning Style Inventory are shown in Fig. 1 below.

figure 1

VARK learning styles

According to the VARK model, learners are classified into four groups representing basic learning styles based on their responses to a 16-item questionnaire; each item offers four possible responses, and each response corresponds to one of the dimensions (Hussain, 2017; Silva, 2020; Zhang, 2017), which helps instructors design effective courses for their students. Visual learners prefer to receive instructional materials and submit assignments using tools such as maps, graphs, images, and other symbols (Fleming & Baume, 2006). Read/write learners prefer written, textual learning materials such as glossaries, handouts, textbooks, and lecture notes. Aural learners, on the other hand, prefer to learn through spoken materials, dialogue, lectures, and discussions. Kinesthetic learners prefer direct practice and learning by doing (Becker et al., 2007; Fleming & Baume, 2006; Willingham et al., 2015). Accordingly, this research aims to provide a comprehensive discussion of how these individual parameters can be applied in adaptive e-learning practice. Dominic et al. (2015) presented a framework for an adaptive educational system that personalized learning content based on students' learning styles (the Felder-Silverman model) and other factors such as learners' competency level in the subject. The framework let students follow adaptive learning-content paths based on their answers to the ILS (Index of Learning Styles) questionnaire, and provided a customized environment that could automatically respond to students' learning styles and suggest fully personalized online activities. Similarly, El Bachari et al. (2011) attempted to determine each student's learning style and then adapt instruction to that individual's interests. According to Alshammari et al. (2015), adaptive e-learning based on learner experience and learning style has a higher degree of perceived usability than a non-adaptive e-learning system; it can also improve learners' satisfaction, engagement, and motivation, and thus their learning.
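To illustrate the classification step, the sketch below scores a VARK-style questionnaire of 16 multiple-choice items, where each option maps to one of the V/A/R/K categories and the most frequent category is taken as the learner's style. The mapping and the tie-handling rule here are simplifying assumptions, not the official VARK scoring key.

```python
# Hedged sketch of VARK-style scoring: 16 items, four options each, every
# option mapped to one style letter. This is illustrative, not the official
# VARK scoring procedure published by Fleming.
from collections import Counter

STYLES = ("V", "A", "R", "K")

def score_vark(answers):
    """answers: list of 16 single-letter responses, each in {"V", "A", "R", "K"}."""
    counts = Counter(answers)
    top = max(counts.values())
    # Ties are reported as multimodal (e.g. "VA"), a simplification of VARK's own rules.
    dominant = "".join(s for s in STYLES if counts.get(s, 0) == top)
    return dominant, dict(counts)

# Example: a learner who mostly chose visual options.
style, profile = score_vark(["V"] * 9 + ["A"] * 3 + ["R"] * 2 + ["K"] * 2)
print(style, profile)   # -> V {'V': 9, 'A': 3, 'R': 2, 'K': 2}
```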

According to the findings of Akbulut and Cardak (2012), Alshammari and Qtaish (2019), Alzain et al. (2018a, b), Shi et al. (2013), and Truong (2016), adaptation based on a combination of learning style and knowledge level yields significantly better learning gains. Researchers have recently begun to focus on how to personalize e-learning experiences using personal characteristics such as a student's preferred learning style. Adaptive learning programs address individual learning challenges by providing learners with courses fitted to their specific needs, including their learning styles.

  • Student engagement

Previous research has emphasized that student participation is a key factor in overcoming academic problems such as poor academic performance, isolation, and high dropout rates (Fredricks et al., 2004). Participation is vital to student learning, especially in an online environment where students may feel isolated and disconnected (Dixson, 2015). Student engagement is the degree to which students consciously engage with a course's materials, other students, and the instructor, and it is significant for keeping students involved in the course and, as a result, in their learning (Barkley & Major, 2020; Lee et al., 2019; Rogers-Stacy et al., 2017). Extensive research has investigated the degree of student engagement in both web-based and traditional education systems, for instance by using a variety of methods and input features to test the relationship between student data and student participation (Hussain et al., 2018). Guo et al. (2014) examined students' engagement while watching videos, using as input features how long students watched and how often they responded to the assessments.

Atherton et al. (2017) found a correlation between the use of course materials and student performance: greater use of course content is more likely to lead to better grades. Pardo et al. (2016) found that students' interaction with interactive learning activities has a significant impact on their test scores. Previous research thus shows that course results are positively correlated with student participation. For example, Atherton et al. (2017) reported that students who regularly accessed learning materials online and took the exams obtained higher test scores, and other studies have shown that students with higher levels of participation in questionnaires and course activities tend to perform well (Mutahi et al., 2017).

According to Dixson (2015), skills, emotion, participation, and performance are the factors of online learning engagement. Skills engagement covers behaviors such as practicing regularly, paying attention while listening and reading, and taking notes. Emotion refers to how learners feel about learning, such as how much they want to learn. Participation refers to how learners act in class, for example in chat, discussion, or conversation. Performance is an outcome, such as a good grade or test score. In general, engagement means that students spend time and energy on learning materials and skills, interact constructively with others in the classroom, and are at least somewhat emotionally invested in their learning (that is, they are motivated by an idea and willing to learn and interact). Student engagement thus arises from personal attitudes, thoughts, behaviors, and communication with others, and from the thought, effort, and feeling invested in studying. The student engagement scale therefore attempts to measure what students are doing (thinking actively), how they relate to their learning, and how they relate to the content, faculty members, and other learners, covering the factors shown in Fig. 2 (skills, participation/interaction, performance, and emotions). Previous research has accordingly moved beyond comparing online and face-to-face classes to investigating ways to improve online learning (Dixson, 2015; Gaytan & McEwen, 2007; Lévy & Wakabayashi, 2008; Mutahi et al., 2017). Reviews of prior work identify learning effort, involvement in activities, interaction, and learning satisfaction as significant measures of student engagement in learning environments (Dixson, 2015; Evans et al., 2017; Lee et al., 2019; Mutahi et al., 2017; Rogers-Stacy et al., 2017). These results point to several features of e-learning environments that can be used as measures of student participation. Successful and engaged online learners learn actively, have the psychological motivation to learn, make good use of prior experience, and make effective use of online technology. Furthermore, they have excellent communication abilities and are adept at both cooperative and self-directed learning (Dixson, 2015; Hong, 2009; Nkomo et al., 2021).

figure 2

Engagement factors
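As a concrete illustration of how such an engagement scale can be scored, the sketch below averages 5-point Likert responses within each of the four factors (skills, participation/interaction, performance, emotion). The item identifiers and groupings are hypothetical placeholders, not the items of the scale used in this study.

```python
# Illustrative scoring of a four-factor engagement scale from 5-point Likert
# responses. Item ids and their assignment to factors are hypothetical.
FACTORS = {
    "skills":        ["sk1", "sk2", "sk3"],
    "participation": ["pa1", "pa2"],
    "performance":   ["pe1", "pe2"],
    "emotion":       ["em1", "em2"],
}

def engagement_scores(responses):
    """responses: item id -> Likert rating (1-5). Returns the mean score per factor."""
    scores = {}
    for factor, items in FACTORS.items():
        ratings = [responses[i] for i in items if i in responses]
        scores[factor] = sum(ratings) / len(ratings) if ratings else None
    answered = [v for v in scores.values() if v is not None]
    scores["overall"] = sum(answered) / len(answered) if answered else None
    return scores

print(engagement_scores({"sk1": 4, "sk2": 5, "pa1": 3, "pe1": 4, "em1": 5}))
```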

Overview of designing the adaptive e-learning environment

To answer the first research question, the paper follows the ADDIE instructional design model: analysis, design, development, implementation, and evaluation. The adaptive learning environment offers an interactive, decentralized media environment that takes individual differences among students into account. Moreover, the environment can spread a culture of self-learning, attract students, and increase their engagement in learning.

Any learning environment intended to accomplish a specific goal should be designed consistently so as to increase students' motivation to learn, giving them content that is personalized to their specific requirements rather than one-size-fits-all content. Accordingly, a set of instructional design standards for an adaptive e-learning framework based on learning styles was developed, as shown in the following diagram (Fig. 3).

figure 3

The ID (model) of the adaptive e-learning environment

As shown in the previous figure, the analysis phase included identifying the course materials and learning tools (syllabus and course-plan modules) used for the study. The learning objectives targeted the higher cognitive levels (C4-C6: analysis, synthesis, and evaluation).

The design phase included writing SMART objectives and drafting the learning materials within the module plans. To support adaptive learning, four content paths were identified, and the learning models, processes, and assessments were chosen. Course structure and navigation were planned, and the adaptive structural design specified the relationships between the different components, such as introductory units, learning materials, and quizzes, as well as the materials for each of the four paths. The course instructional materials were identified as shown in Fig. 4.

figure 4

Adaptive e-course design
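For orientation, the configuration below sketches how four content paths might map learning styles to media types. The media lists are illustrative assumptions; the concrete materials assigned to each path in the actual course are those shown in Fig. 4.

```python
# Hypothetical configuration of the four adaptive content paths; the media
# listed here are examples, not the study's actual course materials.
CONTENT_PATHS = {
    "visual":      ["concept maps", "infographics", "annotated diagrams", "charts"],
    "aural":       ["recorded lectures", "podcasts", "group discussions"],
    "read_write":  ["handouts", "lecture notes", "glossaries", "reading lists"],
    "kinesthetic": ["simulations", "hands-on exercises", "case-based activities"],
}

def path_for(style: str):
    """Return the instructional media planned for a given VARK style."""
    return CONTENT_PATHS[style]

print(path_for("read_write"))
```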

The development phase included preparing and selecting the media for the e-course for each content path in the adaptive e-learning environment. During this phase, the author completed the storyboard and specified the media to be included on each of its pages; a category of instructional media was developed for each path (Fig. 5).

figure 5

Roles and deployment diagram of the adaptive e-learning environment

The author developed a learning styles questionnaire delivered via a mobile app ( https://play.google.com/store/apps/details?id=com.pointability.vark ). The students then accessed the adaptive e-course modules based on their learning styles.

The implementation phase involved professional validation of the course instructional materials: expert validation was used to evaluate the consistency of the course materials (syllabi and modules), covering student learning activities, the feasibility of implementing the learning, and student reactions to the modules. The learner's behaviors, errors, navigation, and learning process were continuously used to improve the learner's modules based on the data gathered about him or her.

In the evaluation phase, five e-learning specialists reviewed the adaptive e-learning environment, and the framework was then revised based on their recommendations and feedback. The evaluation covered content assessment and media evaluation in three forms: instructional design, interface design, and usability. Learners then checked the proposed framework in two stages; the first was pilot testing, in which ten learners representative of the sample tried the proposed environment: each learner's behavior was observed, questions were answered, and learning control, media access, and time spent learning were all verified.

Research methodology

Research purpose and questions

This research aims to investigate the impact of designing an adaptive e-learning environment on the development of students' engagement. The research conceptual framework is illustrated in Fig. 6. The main research question is: "What is the impact of an adaptive e-learning environment based on (VARK) learning styles on developing students' engagement?" Accordingly, there are two sub-questions: (a) "What is the instructional design of the adaptive e-learning environment?" and (b) "What is the impact of adaptive e-learning based on (VARK) learning styles on developing students' engagement (skills, participation, performance, emotion) in comparison with conventional e-learning?"

figure 6

The conceptual framework (model) of the research questions

Research hypotheses

The research aims to verify the validity of the following hypotheses:

There is no statistically significant difference between the mean scores of the experimental group, which was exposed to the adaptive e-learning environment, and those of the control group, which was exposed to the conventional e-learning environment, in the pre-application of the students' engagement scale.

There is a statistically significant difference at the 0.05 level between the mean scores of the experimental group (adaptive e-learning) and those of the control group (conventional e-learning) in the post-application of the students' engagement factors, in favor of the experimental group.

Research design

This research used a quasi-experimental pretest-posttest design. The independent and dependent research variables are shown in Fig. 7.

figure 7

Research "Experimental" design

Both groups were informed of the learning activity tracks. The experimental group was instructed to use the adaptive learning environment to accomplish the learning goals, while the control group was exposed to the conventional e-learning environment without the adaptive parameters.

Research participants

The population of the study consisted of students aged 17-18 years studying the "learning skills" course in the common first-year deanship. All participants were enrolled in the first term of the 2019-2020 academic year and were taught by the same instructors. The research sample comprised two classes (118 students) selected randomly from the learning skills department. One class was randomly assigned as the control group (N = 58; 31 males and 27 females) and the other as the experimental group (N = 60; 36 males and 24 females). Table 1 shows the distribution of the students' demographic data.

The instructional materials had not been presented to the students before. The control group attended the conventional e-learning class, in which the "learning skills" course was provided without the adaptive parameter based on learning styles. The experimental group learned the same course materials within the e-course using adaptive e-learning based on learning styles. Moreover, all participants were required to read the guidelines and give their permission, indicating their readiness to take part in the research experiment.

Research instruments

In this research, the measuring tools were the VARK questionnaire and the students' engagement scale, which comprises the following factors: skills, participation/interaction, performance, and emotion. The pre-post scale was designed to assess the level of student engagement with the "learning skills" course before and after participating in the experiment.

VARK questionnaire

Questionnaires are a common method of collecting data in education research (McMillan & Schumacher, 2006). The VARK questionnaire was administered electronically, distributed to students through the developed mobile app, and recorded on the UQU system. It consisted of 16 multiple-choice items classified into four main factors (kinesthetic, auditory, visual, and read/write).

Reliability and validity of the VARK questionnaire

For the reliability analysis, Cronbach's alpha was used to evaluate internal consistency. Internal consistency was calculated from the correlation of each item with the factor to which it belongs and the correlations among the factors. Values of 0.70 and above are normally recognized as indicating high reliability (Hinton et al., 2014). The Cronbach's alpha coefficient for the VARK questionnaire was 0.83, indicating that the questionnaire was reliable and suitable for further research.
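For readers unfamiliar with the statistic, the sketch below computes Cronbach's alpha from an item-response matrix using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The data here are simulated purely for demonstration, so the resulting value is not meaningful; it is not the study's data.

```python
# Sketch of the Cronbach's alpha computation used for internal consistency.
# `items` is a 2-D array (respondents x items); values of about 0.70 and above
# are conventionally read as acceptable reliability.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Purely synthetic demo: 30 respondents, 16 items rated 1-5.
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(30, 16))
print(round(cronbach_alpha(demo), 2))
```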

Students' engagement scale

The engagement scale was developed after a review of the literature on student engagement. The Dixson scale, which consists of four major factors (skills, participation/interaction, performance, and emotion), was used to measure student engagement. The author adapted the original Dixson scale as follows: the scale, which consisted of 48 statements, was translated into Arabic by the author and, after consultation with experts, the items were reduced to 27 to suit the university learning environment. The scale is rated on a 5-point scale.

The final version of the engagement scale comprised four factors: skills engagement (ten items), measuring keeping up with and reading the instructional materials and exerting effort; participation/interaction engagement (five items), measuring enjoyment and regular engagement in group discussion; performance engagement (five items), measuring test performance and achieving good scores; and emotional engagement (seven items), measuring whether the course was found interesting. Students could access the engagement scale via the following link: http://bit.ly/2PXGvvD . The objective of the scale is thus to measure common first-year students' possession of the basic engagement factors before and after instruction with adaptive e-learning compared with conventional e-learning.

Reliability and validity of the engagement scale

The alpha coefficients of the scale factors are presented below. All four subscales showed a strong degree of internal consistency (0.80-0.87), indicating strong reliability. The overall reliability of the instruments used in this study, calculated with Cronbach's alpha, was 0.81, meaning that the instruments were reliable. The instruments therefore demonstrated strong validity and reliability, allowing an accurate assessment of students' engagement in learning. The scale was applied to a pilot sample of 20 students outside the experimental sample, and the correlation coefficients (0.74-0.82) indicated a degree of validity that supports the instrument's use. Table 2 shows the correlation coefficients and Cronbach's alpha for the engagement scale.

To verify content validity, the scale was also submitted to specialists for their views on the clarity of the wording and its suitability for measuring students' engagement, and for any modifications they deemed appropriate.

Research procedures

To check the homogeneity and equivalence of the two groups, the first hypothesis was examined, which stated: "There is no statistically significant difference between the mean scores of the experimental group, which was exposed to the adaptive e-learning environment, and those of the control group, which was exposed to the conventional e-learning environment, in the pre-application of the students' engagement scale." The author applied the engagement scale to both groups beforehand, and the pre-application scores were examined to verify the equivalence of the experimental and control groups in terms of students' engagement.

An independent-samples t-test was run on the engagement scale to confirm the homogeneity of the two classes before the experiment. The t-values were not significant at the 0.05 level, meaning that the two groups were homogeneous on the students' engagement scale before the experiment.

Since there was no significant difference in the mean scores of the two groups (p > 0.05), the findings in Table 3 show no significant difference between the experimental and control groups in engagement as a whole or in any individual engagement factor. The two classes were therefore similar before the start of the research experiment.
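The sketch below reproduces the form of this equivalence check: an independent-samples t-test on the two groups' pre-test engagement scores. The group sizes mirror the study (control n = 58, experimental n = 60), but the scores themselves are simulated, so the printed values are illustrative only.

```python
# Sketch of the pre-test equivalence check with scipy's independent-samples
# t-test. The engagement pre-scores below are simulated, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control_pre = rng.normal(loc=3.0, scale=0.6, size=58)
experimental_pre = rng.normal(loc=3.0, scale=0.6, size=60)

t_stat, p_value = stats.ttest_ind(experimental_pre, control_pre, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 indicates no significant pre-test difference, i.e. homogeneous groups.
```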

Learner content path in adaptive e-learning environment

The well-designed processes described above are the foundation for adaptation in the e-learning environment. Materials are accommodated through defined entry points, including classification by learning style: kinesthetic, auditory, visual, and read/write. The present study covered the first semester of the 2019/2020 academic year. The course was divided into modules concentrating on various topics; eleven of the modules included the adaptive learning exercise, and the exercises and quizzes were assigned to specific textbook modules. To reduce irrelevant variation, all versions of the course covered the same content, had the same learning outcomes, and were taught by the same instructor.

Students in the experimental group were asked to bring smartphones and were shown how to download the adaptive learning application. A personal account was created for each student, who then accessed the channel designed through the application; students were given instructions and training on how to enter the application and reach the default learning objects appropriate to them. The control group used the variety of instructional materials provided in the same course.

In the adaptive e-course, students in the experimental group were first presented with the questionnaire and asked to answer its questions via the developed mobile app, with four choices offered for each question. Students were allowed to answer the questions; the correct answer was shown alongside the student's responses in the results, but the learning module was marked as incomplete. If a student chose to respond to a question, the correct answer appeared immediately, regardless of the student's response.

Figure 8 shows a visual example of learning style identification through responses to the VARK questionnaire. The learning process experienced by the students in this adaptive learning environment is shown in Fig. 4. Students opened the adaptive course via the app ( https://play.google.com/store/apps/details?id=com.pointability.vark ), which displayed both the learning skills course and each student's current status and directed students to the learning skills they were interested in studying further. Once students reached a given point in the e-learning environment, they could access the relevant digital instructional materials and then progress through the various styles offered by the proposed method, giving them greater flexibility in their learning pace.

figure 8

Visual example of learning style identification and the adaptive e-learning course process

The "flowchart" diagram below illustrates the learner's path in an adaptive e-learning environment, depending on the (VARK) learning styles (visual, auditory, kinesthetic, reading/writing) (Fig. 9 ).

figure 9

Student learning path

Following the design model of the adaptive framework described above, the students completed the learning styles questionnaire and, based on their individual results, were directed to one of the "Visual", "Aural", "Read-Write", or "Kinesthetic" paths. At the beginning, each student also completed the engagement scale online at their own pace.
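The routing step in the flowchart can be sketched as follows: the student's VARK result selects one of the four content tracks, and an individualized module sequence is assembled from it. Track names, module names, and the handling of multimodal results are assumptions introduced for this illustration, not the study's exact rules.

```python
# Illustrative routing corresponding to the flowchart in Fig. 9. Names and the
# multimodal fallback are hypothetical simplifications.
TRACKS = {"V": "visual", "A": "aural", "R": "read_write", "K": "kinesthetic"}

def build_learning_path(vark_result: str, modules: list) -> list:
    """Return (module, track) pairs for the learner's dominant style.

    Multimodal results (e.g. "VA") fall back to the first listed style,
    a simplifying assumption rather than the study's actual rule.
    """
    track = TRACKS[vark_result[0]]
    return [(module, track) for module in modules]

plan = build_learning_path("V", ["module_01", "module_02", "module_03"])
print(plan)   # [('module_01', 'visual'), ('module_02', 'visual'), ('module_03', 'visual')]
```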

Based on the results, the system produced an individualized learning plan to fill the gaps indicated by the learner's initial VARK results. The learner model represents important learner characteristics such as personal information, knowledge level, and learning preferences. Pre- and post-measurements were taken for both the experimental and control groups, and only the experimental group received the treatment (use of the adaptive learning environment).

The second question asks: "What is the impact of adaptive e-learning based on (VARK) learning styles on developing students' engagement (skills, participation/interaction, performance, emotion) in comparison with conventional e-learning?"

To address it, the second hypothesis was tested, which states: "There is a statistically significant difference at the 0.05 level between the mean scores of the experimental group (adaptive e-learning) and those of the control group (conventional e-learning) in the post-application of the students' engagement factors, in favor of the experimental group." To test this hypothesis, the means, standard deviations, and t-test values were calculated for the two research groups' scores on the engagement scale factors.

Table 4 indicates that students in the experimental group had significantly higher mean post-test engagement scores (across the engagement factor items) than students in the control group (p < 0.05).

The experiment was performed to evaluate the impact of the proposed adaptive e-learning. Independent-samples t-tests were used to compare the behavioral engagement of the two groups in relation to the topic of this research, and the findings indicated that the experimental group students achieved more than those taught using the conventional e-learning approach.

To verify the effect size of the independent variable on the dependent variable, Cohen's d was used to investigate whether adaptive learning can significantly improve students' engagement. According to Cohen (1992), an effect size (ES) of 0.20 is small, 0.50 is medium, and 0.80 is large. For the post-test of the student engagement scale, the effect size between the experimental and control groups' scores was calculated (d and r) from the means and standard deviations, giving Cohen's d = 0.826 and effect-size r = 0.401. An ES of this magnitude means that the mean of the treated group lies at roughly the 79th percentile of the control group, a large effect. Effect sizes can also be read as the average percentile rank of the average treated learner relative to the average untreated learner: an ES of 0.0 places the treated group's mean at the 50th percentile of the untreated group, whereas an ES of 0.8 places it at the 79th percentile. Given that effect size is a significant indicator of a study's strength, the results show that the dependent variable was strongly influenced across the four engagement factors: skills, performance, participation/interaction, and emotion.
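The sketch below shows the form of these effect-size computations: Cohen's d from the two groups' means and pooled standard deviation, and the conversion r = d / sqrt(d^2 + 4) for roughly equal group sizes. The summary statistics in the example are hypothetical, chosen only to produce a d of similar magnitude; the exact reported values (d = 0.826, r = 0.401) may reflect a slightly different pooling or conversion convention.

```python
# Effect-size sketch: Cohen's d with pooled SD, and the d-to-r conversion for
# approximately equal group sizes. The input statistics are hypothetical.
import math

def cohens_d(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    pooled_sd = math.sqrt(((n_exp - 1) * sd_exp**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_exp + n_ctrl - 2))
    return (mean_exp - mean_ctrl) / pooled_sd

def d_to_r(d):
    return d / math.sqrt(d**2 + 4)

d = cohens_d(mean_exp=4.1, sd_exp=0.50, n_exp=60,
             mean_ctrl=3.7, sd_ctrl=0.47, n_ctrl=58)
print(round(d, 3), round(d_to_r(d), 3))   # e.g. ~0.82 and ~0.38 for these inputs
```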

Discussions and limitations

This section discusses the impact of the adaptive e-learning environment on the development of student engagement. The paper aimed to design an adaptive e-learning environment based on learning style parameters. The findings revealed that the following factors correlate with student engagement in e-learning: skills, participation/interaction, performance, and emotion. These engagement factors are significant because they affect learning outcomes (Nkomo et al., 2021), and each factor's items correspond to cognitive process-related activities. The participation/interaction factor, for example, refers to interactions with the content, peers, and instructors; student engagement in e-learning can therefore be predicted from such interactions. The results are in line with previous research, which found that customized learning materials are important for increasing students' engagement. Adaptive e-learning based on learning styles places a strong emphasis on behavioral engagement, in which students manage their learning while actively participating in online classes whose instruction is adapted to each learning style, leading to improved learning outcomes (Al-Chalabi & Hussein, 2020; Chun-Hui et al., 2017; Hussein & Al-Chalabi, 2020; Pashler et al., 2008). The experimental findings of this research showed that students who learned through adaptive e-learning based on learning styles learned more; learning styles are treated in this research as one of the generally assumed bases for adapting the e-content path. Students in the experimental group reported that the adaptive e-learning environment was very interesting and attracted their attention; they also indicated that it was particularly useful because it gave them opportunities to recall the learning content, enhancing their overall learning impression. This may explain why students in the experimental group performed well in class and showed more enthusiasm than students in the control group. The research compared an adaptive e-learning environment with a conventional e-learning approach to engagement in a learning skills course through instructional content delivery and assessment. It can also be noted that the experimental group participated more than the control group, indicating that the BB activities were better adapted to the students' learning styles. Previous studies agree on the effectiveness of adaptive learning: it provides students with quality opportunities adapted to their learning styles and preferences (Alshammari, 2016; Hussein & Al-Chalabi, 2020; Roy & Roy, 2011; Surjono, 2014). However, this study is restricted to one aspect of content adaptation, namely adapting learning materials to learning styles; other considerations include content-dependent adaptation. These findings are consistent with other studies, such as Alshammari and Qtaish (2019) and Chun-Hui et al. (2017), which have demonstrated the effectiveness of adaptive e-learning environments. This research differs from others in its focus on Umm Al-Qura University as a case study, its selection of the VARK learning styles, its engagement factors, and its use of a closed learning management system (BB).

The findings of the study also revealed that adaptive content has a positive impact on students' achievement and engagement when it is adapted to their learning styles (kinesthetic, auditory, visual, read/write). Several factors contributed to this. The design of the adaptive e-content for learning skills rested on providing an ideal learning environment for learners and supporting the adaptation of learning to each learning style, encouraging them to learn directly, supporting knowledge building, and making the learning process enjoyable; Ali et al. (2019) confirm this, noting that education should be adapted to each individual's learning style, needs, and characteristics. The adaptive e-content design also allows different learners to engage with knowledge by presenting information and skills in a logical sequence based on the adaptive e-learning framework, taking into account their capabilities as well as the diversity of sources across the web; this is consistent with the findings of Alshammari and Qtaish (2019).

Accordingly, the above results can be attributed to the following: the sound design of the adaptive e-learning environment in light of learning styles and educational preferences, following its instructional design (ID) standards; the provision of adaptive content suited to learners' needs, characteristics, and learning styles; the diversity of course content elements (texts, static images, animations, and video); the variety of tests and activities; the diverse forms of reinforcement, feedback, and support from the instructor and peers according to learning style; and the environment's ease of use, its multiple and varied learning sources, and its ability to return learners to the same point when they leave the environment.

Several studies have shown that adaptive e-learning technologies allow students to improve their learning and further enhance their engagement in terms of skills, performance, interaction, and emotion (Ali et al., 2019; Graf & Kinshuk, 2007; Murray & Pérez, 2015); nevertheless, Murray and Pérez (2015) found that adaptive learning environments have a limited impact on learning outcomes.

The limited empirical findings on the efficacy of adapting teaching to learning style are mixed. Chun-Hui et al. (2017) demonstrated that adaptive e-learning technologies can benefit students' learning and development; on this view, adaptive e-learning can be considered a valuable method because it attracts students' attention and promotes their participation in educational activities (Ali et al., 2019). However, only a few recent studies have examined how adaptive e-learning based on learning styles fits into diverse cultural programs (Benhamdi et al., 2017; Pashler et al., 2008).

The experimental results revealed that the proposed environment significantly increased students' learning achievement compared with the conventional e-learning classroom (without adaptive technology). This means that the proposed environment's adaptation can increase students' engagement in the learning process; there is also evidence that an adaptive environment positively affects other aspects of quality, such as student engagement (Murray & Pérez, 2015).

Conclusions and implications

Although this field of research has attracted much interest in recent years, some questions remain unanswered. This study addresses some of these gaps by developing an adaptive e-learning environment that was shown to increase student engagement. The study aimed to design an adaptive e-learning environment for performing interactive learning activities in a learning skills course. The main findings revealed a significant difference in learning outcomes and positive results for adaptive e-learning students, indicating that it may be a helpful learning method for higher education, and the study contributes to the current adaptive e-learning literature. The findings revealed that adaptive e-learning based on learning styles can help students stay engaged; indeed, it increased student engagement significantly. According to this research, each student's learning style is unique, and students prefer different types of instructional materials and activities; furthermore, students' preferences affect the effectiveness of learning. As a result, the most effective learning environments adjust their output to students' needs. Developing high-quality instructional materials and activities adapted to students' learning styles will help them participate and be more motivated. In conclusion, learning styles are a useful starting point for creating instructional materials grounded in learning theories.

This study's results have important educational implications for future work on the effect of adaptive e-learning on student engagement. First, the findings may provide data to support the development and improvement of adaptive environments used in blended learning. Second, the results emphasize the need for more quasi-experimental and descriptive research to better understand the benefits and challenges of incorporating adaptive e-learning in higher education institutions. Third, the results indicate that using an adaptive model in an adaptive e-learning environment will encourage, motivate, and engage students in active learning and facilitate their knowledge construction, rather than having them simply take in information passively. Fourth, new research is needed to design effective environments in which adaptive learning can be used in higher education institutions to increase academic performance and motivation in the learning process. Finally, the study shows that adaptive e-learning allows students to learn individually, which improves their learning and knowledge of course content, for example by increasing their knowledge of learning skills course topics beyond what they could learn in a conventional e-learning classroom.

Contribution to research

The study is intended to provide empirical evidence of the effect of adaptive e-learning on student engagement factors. It also has practical implications for higher education stakeholders, as it offers university faculty members learning approaches that can improve student engagement, as well as a framework for designing personalized, adaptive e-learning environments based on learning styles in various learning situations.

Research implication

Students are more likely to enjoy learning in their preferred learning styles if they are provided with a variety of instructional materials, such as references, interactive media, videos, podcasts, storytelling, simulations, animations, problem-solving, games, and accessible educational tools, in an e-learning environment; different learning strategies can also be accommodated. The current study enables other researchers to conduct future work on the use of the adaptive e-learning approach throughout the instructional process, at different phases of learning, and in various e-courses. Meanwhile, the proposed environment's positive impact on student engagement makes it of considerable interest for future educational applications. Further research on learning styles in different university colleges could provide a foundation for designing adaptive e-courses based on students' learning styles and direct more future research on learning styles.

Implications for practice or policy:

  • Adaptive e-learning based on learning styles would help students become more engaged.

  • The efficacy of the adaptive e-learning environment was demonstrated through comparison with conventional e-learning.

Availability of data and materials

The author confirms that the data supporting the findings of this study are based on the research tools which were prepared and explained by the author and available on the links stated in the research instruments sub-section. The data analysis that supports the findings of this study is available on request from the corresponding author.

Akbulut, Y., & Cardak, C. (2012). Adaptive educational hypermedia accommodating learning styles: A content analysis of publications from 2000 to 2011. Computers & Education . https://doi.org/10.1016/j.compedu.2011.10.008 .


Al-Chalabi, H., & Hussein, A. (2020). Analysis & implementation of personalization parameters in the development of computer-based adaptive learning environment. SAR Journal Science and Research., 3 (1), 3–9. https://doi.org/10.18421//SAR31-01 .

Aldosari, M., Aljabaa, A., Al-Sehaibany, F., & Albarakati, S. (2018). Learning style preferences of dental students at a single institution in Riyadh Saudi Arabia, evaluated using the VARK questionnaire . Advances in Medical Education and Practice. https://doi.org/10.2147/AMEP.S157686 .

Ali, N., Eassa, F., & Hamed, E. (2019). Personalized Learning Style for Adaptive E-Learning System, International Journal of Advanced Trends in Computer Science and Engineering . 223-230. Retrieved June 26, 2020 from http://www.warse.org/IJATCSE/static/pdf/file/ijatcse4181.12019.pdf .

Alshammari, M., & Qtaish, A. (2019). Effective adaptive e-learning systems according to learning style and knowledge level. JITE Research, 18 , 529–547. https://doi.org/10.28945/4459 .

Alshammari, M. (2016). Adaptation based on learning style and knowledge level in e-learning systems, Ph.D. thesis , University of Birmingham.  Retrieved April 18, 2019 from http://etheses.bham.ac.uk//id/eprint/6702/ .

Alshammari, M., Anane, R., & Hendley, R. (2015). Design and Usability Evaluation of Adaptive E-learning Systems based on Learner Knowledge and Learning Style. Human-Computer Interaction Conference- INTERACT , Vol. (9297), (pp. 157–186). https://doi.org/10.1007/978-3-319-22668-2_45 .

Alzain, A., Clack, S., Jwaid, A., & Ireson, G. (2018a). Adaptive education based on learning styles: Are learning style instruments precise enough. International Journal of Emerging Technologies in Learning (iJET), 13 (9), 41–52. https://doi.org/10.3991/ijet.v13i09.8554 .

Alzain, A., Clark, S., Ireson, G., & Jwaid, A. (2018b). Learning personalization based on learning style instruments. Advances in Science Technology and Engineering Systems Journal . https://doi.org/10.25046/aj030315 .

Atherton, M., Shah, M., Vazquez, J., Griffiths, Z., Jackson, B., & Burgess, C. (2017). Using learning analytics to assess student engagement and academic outcomes in open access enabling programs. Journal of Open, Distance and e-Learning, 32 (2), 119–136.

Barkley, E., & Major, C. (2020). Student engagement techniques: A handbook for college faculty . Jossey-Bass . 10:047028191X.


Becker, K., Kehoe, J., & Tennent, B. (2007). Impact of personalized learning styles on online delivery and assessment. Campus-Wide Information Systems . https://doi.org/10.1108/10650740710742718 .

Behaz, A., & Djoudi, M. (2012). Adaptation of learning resources based on the MBTI theory of psychological types. IJCSI International Journal of Computer Science, 9 (2), 135–141.

Beldagli, B., & Adiguzel, T. (2010). Illustrating an ideal adaptive e-learning: A conceptual framework. Procedia - Social and Behavioral Sciences, 2 , 5755–5761. https://doi.org/10.1016/j.sbspro.2010.03.939 .

Benhamdi, S., Babouri, A., & Chiky, R. (2017). Personalized recommender system for e-Learning environment. Education and Information Technologies, 22 , 1455–1477. https://doi.org/10.1007/s10639-016-9504-y .

Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54 , 1222–1232.

Chun-Hui, Wu., Chen, Y.-S., & Chen, T. C. (2017). An adaptive e-learning system for enhancing learning performance: based on dynamic scaffolding theory. Eurasia Journal of Mathematics, Science and Technology Education. https://doi.org/10.12973/ejmste/81061 .

Cletus, D., & Eneluwe, D. (2020). The impact of learning style on student performance: mediate by personality. International Journal of Education, Learning and Training. https://doi.org/10.24924/ijelt/2019.11/v4.iss2/22.47Desmond .

Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science., 1 (3), 98–101. https://doi.org/10.1111/1467-8721.ep10768783 .

Daines, J., Troka, T. and Santiago, J. (2016). Improving performance in trigonometry and pre-calculus by incorporating adaptive learning technology into blended models on campus. https://doi.org/10.18260/p.25624 .

DeCapua, A. & Marshall, H. (2015). Implementing a Mutually Adaptive Learning Paradigm in a Community-Based Adult ESL Literacy Class. In M. Santos & A. Whiteside (Eds.). Low Educated Second Language and Literacy Acquisition. Proceedings of the Ninth Symposium (pps. 151-171). Retrieved Nov. 14, 2020 from https://www.researchgate.net/publication/301355138_Implementing_a_Mutually_Adaptive_Learning_Paradigm_in_a_Community-Based_Adult_ESL_Literacy_Class .

Dixson, M. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning . https://doi.org/10.24059/olj.v19i4.561 .

Dominic, M., Xavier, B., & Francis, S. (2015). A Framework to Formulate Adaptivity for Adaptive e-Learning System Using User Response Theory. International Journal of Modern Education and Computer Science, 7 , 23. https://doi.org/10.5815/ijmecs.2015.01.04 .

El Bachari, E., Abdelwahed, E. H., & El Adnani, M. (2011). E-learning personalization based on dynamic learners' preference. International Journal of Computer Science and Information Technology, 3, 200–216. https://doi.org/10.5121/ijcsit.2011.3314 .

El-Sabagh, H. A., & Hamed, E. (2020). The Relationship between Learning-Styles and Learning Motivation of Students at Umm Al-Qura University. Egyptian Association for Educational Computer Journal . https://doi.org/10.21608/EAEC.2020.25868.1015 ISSN-Online: 2682-2601.

Ennouamani, S., & Mahani, Z. (2017). An overview of adaptive e-learning systems. Eighth International ConfeRence on Intelligent Computing and Information Systems (ICICIS) . https://doi.org/10.1109/INTELCIS.2017.8260060 .

Evans, S., Steele, J., Robertson, S., & Dyer, D. (2017). Personalizing post titles in the online classroom: A best practice? Journal of Educators Online, 14 (2), 46–54.

Fleming, N., & Baume, D. (2006). Learning styles again: VARKing up the Right Tree! Educational Developments, 7 , 4–7.

Franzoni, A., & Assar, S. (2009). Student learning style adaptation method based on teaching strategies and electronic media. Journal of Educational Technology & Society , 12(4), 15–29. Retrieved March 21, 2020, from http://www.jstor.org/stable/jeductechsoci.12.4.15 .

Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept . State of the Evidence: Review of Educational Research. https://doi.org/10.3102/00346543074001059 .


Gaytan, J., & McEwen, M. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21 (3), 117–132. https://doi.org/10.1080/08923640701341653 .

Graf, S., & Kinshuk (2007). Providing adaptive courses in learning management systems with respect to learning styles. Proceedings of the World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 2576–2583). Association for the Advancement of Computing in Education (AACE). Retrieved January 18, 2020 from https://www.learntechlib.org/primary/p/26739/ . ISBN 978-1-880094-63-1.

Guo, P., Kim, V., & Rubin, R. (2014). How video production affects student engagement: an empirical study of MOOC videos. Proceedings of First ACM Conference on Learning @ Scale Confernce . March 2014, (pp. 41-50). https://doi.org/10.1145/2556325.2566239 .

Hinton, P. R., Brownlow, C., McMurray, I., & Cozens, B. (2014). SPSS Explained (2nd ed., pp. 339–354). Routledge Taylor & Francis Group.

Hong, S. (2009). Developing competency model of learners in distance universities. Journal of Educational Technology., 25 , 157–186.

Hussain, I. (2017). Pedagogical implications of VARK model of learning. Journal of Literature, Languages and Linguistics, 38 , 33–37.

Hussain, M., Zhu, W., Zhang, W., & Abidi, S. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. Computational Intelligence, and Neuroscience. https://doi.org/10.1155/2018/6347186 .

Hussein, A., & Al-Chalabi, H. (2020). Pedagogical Agents in an Adaptive E-learning System. SAR Journal of Science and Research., 3 , 24–30. https://doi.org/10.18421/SAR31-04 .

Jaleel, S., & Thomas, A. (2019). Learning styles theories and implications for teaching learning . Horizon Research Publishing. 978-1-943484-25-6.

Johnson, M. (2009). Evaluation of Learning Style for First-Year Medical Students. Int J Schol Teach Learn . https://doi.org/10.20429/ijsotl.2009.030120 .

Jonassen, D. H., & Grabowski, B. L. (2012). Handbook of individual differences, learning, and instruction. Routledge . https://doi.org/10.1016/0022-4405(95)00013-C .

Klasnja-Milicevic, A., Vesin, B., Ivanovic, M., & Budimac, Z. (2011). E-Learning personalization based on hybrid recommendation strategy and learning style identification. Computers & Education, 56 (3), 885–899. https://doi.org/10.1016/j.compedu.2010.11.001 .

Kolekar, S. V., Pai, R. M., & Manohara Pai, M. M. (2017). Prediction of learner’s profile based on learning styles in adaptive e-learning system. International Journal of Emerging Technologies in Learning, 12 (6), 31–51. https://doi.org/10.3991/ijet.v12i06.6579 .

Lee, J., & Kim, D. (2012). Adaptive learning system applied bruner’ EIS theory. International Conference on Future Computer Supported Education, IERI Procedia, 2 , 794–801. https://doi.org/10.1016/j.ieri.2012.06.173 .

Lee, J., Song, H.-D., & Hong, A. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11 , 985. https://doi.org/10.3390/su11040985 .

Leung, A., McGregor, M., Sabiston, D., & Vriliotis, S. (2014). VARK learning styles and student performance in principles of Micro-vs. Macro-Economics. Journal of Economics and Economic Education Research, 15 (3), 113.

Lévy, P. & Wakabayashi, N. (2008). User's appreciation of engagement in service design: The case of food service design. Proceedings of International Service Innovation Design Conference 2008 - ISIDC08 . Busan, Korea. Retrieved October 28, 2019 from https://www.researchgate.net/publication/230584075 .

Liang, J. S. (2012). The effects of learning styles and perceptions on application of interactive learning guides for web-based. Proceedings of Australasian Association for Engineering Education Conference AAEE . Melbourne, Australia. Retrieved October 22, 2019 from https://aaee.net.au/wpcontent/uploads/2018/10/AAEE2012-Liang.-Learning_styles_and_perceptions_effects_on_interactive_learning_guide_application.pdf .

Mahnane, L., Laskri, M. T., & Trigano, P. (2013). A model of adaptive e-learning hypermedia system based on thinking and learning styles. International Journal of Multimedia and Ubiquitous Engineering, 8 (3), 339–350.

Markey, M. K. & Schmit, K, J. (2008). Relationship between learning style Preference and instructional technology usage. Proceedings of American Society for Engineering Education Annual Conference & Expodition . Pittsburgh, Pennsylvania. Retrieved March 15, 2020 from https://peer.asee.org/3173 .

McMillan, J., & Schumacher, S. (2006). Research in education: Evidence-based inquiry . Pearson.

Murphy, R., Gray, S., Straja, S., & Bogert, M. (2004). Student learning preferences and teaching implications: Educational methodologies. Journal of Dental Education, 68 (8), 859–866.

Murray, M., & Pérez, J. (2015). Informing and performing: A study comparing adaptive learning to traditional learning. Informing Science: The International Journal of an Emerging Transdiscipline, 18, 111–125. Retrieved February 4, 2021 from http://www.inform.nu/Articles/Vol18/ISJv18p111-125Murray1572.pdf .

Mutahi, J., Kinai, A. , Bore, N. , Diriye, A. and Weldemariam, K. (2017). Studying engagement and performance with learning technology in an African classroom, Proceedings of Seventh International Learning Analytics & Knowledge Conference , (pp. 148–152), Canada: Vancouver.

Nainie, Z., Siraj, S., Abuzaiad, R. A., & Shagholi, R. (2010). Hypothesized learners’ technology preferences based on learning styles dimensions. The Turkish Online Journal of Educational Technology, 9 (4), 83–93.

Naqeeb, H. (2011). Learning styles as perceived by learners of English as a foreign language in the English Language Center of the Arab American University—Jenin, Palestine. An-Najah Journal of Research, 25, 2232.

Nkomo, L. M., Daniel, B. K., & Butson, R. J. (2021). Synthesis of student engagement with digital technologies: a systematic review of the literature. International Journal of Educational Technology in Higher Education . https://doi.org/10.1186/s41239-021-00270-1 .

Normadhi, N. B., Shuib, L., Nasir, H. N. M., Bimba, A., Idris, N., & Balakrishnan, V. (2019). Identification of personal traits in adaptive learning environment: Systematic literature review. Computers & Education, 130 , 168–190. https://doi.org/10.1016/j.compedu.2018.11.005 .

Nuankaew, P., Nuankaew, W., Phanniphong, K., Imwut, S., & Bussaman, S. (2019). Students model in different learning styles of academic achievement at the University of Phayao, Thailand. International Journal of Emerging Technologies in Learning (iJET)., 14 , 133. https://doi.org/10.3991/ijet.v14i12.10352 .

Oxman, S. & Wong, W. (2014). White Paper: Adaptive Learning Systems. DV X Innovations DeVry Education Group. Retrieved December 14, 2020 from shorturl.at/hnsS8 .

Ozyurt, Ö., & Ozyurt, H. (2015). Learning style-based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52 , 349–358. https://doi.org/10.1016/j.chb.2015.06.020 .

Pardo, A., Han, F., & Ellis, R. (2016). Exploring the relation between self-regulation, online activities, and academic performance: a case study. Proceedings of Sixth International Conference on Learning Analytics & Knowledge , (pp. 422-429). https://doi.org/10.1145/2883851.2883883 .

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: concepts and evidence. Psychology Faculty Publications., 9 (3), 105–119. https://doi.org/10.1111/j.1539-6053.2009.01038.x .

Qazdar, A., Cherkaoui, C., Er-Raha, B., & Mammass, D. (2015). AeLF: Mixing adaptive learning system with learning management system. International Journal of Computer Applications., 119 , 1–8. https://doi.org/10.5120/21140-4171 .

Robinson, C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in online learning. Journal of Education for Business, 84 , 101–109.

Rogers-Stacy, C., Weister, T., & Lauer, S. (2017). Nonverbal immediacy behaviors and online student engagement: Bringing past instructional research into the present virtual classroom. Communication Education, 66 (1), 37–53.

Roy, S., & Roy, D. (2011). Adaptive e-learning system: a review. International Journal of Computer Trends and Technology (IJCTT), 1 (1), 78–81. ISSN:2231-2803.

Shi, L., Cristea, A., Foss, J., Qudah, D., & Qaffas, A. (2013). A social personalized adaptive e-learning environment: a case study in topolor. IADIS International Journal on WWW/Internet., 11 , 13–34.

Shih, M., Feng, J., & Tsai, C. (2008). Research and trends in the field of e-learning from 2001 to 2005: A content analysis of cognitive studies in selected journals. Computers & Education, 51 (2), 955–967. https://doi.org/10.1016/j.compedu.2007.10.004 .

Silva, A. (2020). Towards a Fuzzy Questionnaire of Felder and Solomon for determining learning styles without dichotomic in the answers. Journal of Learning Styles, 13 (15), 146–166.

Staikopoulos, A., Keeffe, I., Yousuf, B. et al., (2015). Enhancing student engagement through personalized motivations. Proceedings of IEEE 15th International Conference on Advanced Learning Technologies , (pp. 340–344), Taiwan: Hualien. https://doi.org/10.1109/ICALT.2015.116 .

Surjono, H. D. (2014). The evaluation of Moodle-based adaptive e-learning system. International Journal of Information and Education Technology, 4 (1), 89–92. https://doi.org/10.7763/IJIET.2014.V4.375 .

Truong, H. (2016). Integrating learning styles and adaptive e-learning system: current developments, problems, and opportunities. Computers in Human Behavior, 55 (2016), 1185–1193. https://doi.org/10.1016/j.chb.2015.02.014 .

Umm Al-Qura University Agency for Educational Affairs (2020). Common first-year Deanship, at Umm Al-Qura University. Retrieved February 3, 2020 from https://uqu.edu.sa/en/pre-edu/70021 .

Vassileva, D. (2012). Adaptive e-learning content design and delivery based on learning style and knowledge level. Serdica Journal of Computing, 6 , 207–252.

Veiga, F., Robu, V., Appleton, J., Festas, I & Galvao, D. (2014). Students' engagement in school: Analysis according to self-concept and grade level. Proceedings of EDULEARN14 Conference 7th-9th July 2014 (pp. 7476-7484). Barcelona, Spain. Available Online at: http://hdl.handle.net/10451/12044 .

Velázquez, A., & Assar, S. (2009). Student learning styles adaptation method based on teaching strategies and electronic media. Educational Technology & Society, 12, 15–29.

Verdú, E., Regueras, L., & De Castro, J. (2008). An analysis of the research on adaptive Learning: The next generation of e-learning. WSEAS Transactions on Information Science and Applications, 6 (5), 859–868.

Willingham, D., Hughes, E., & Dobolyi, D. (2015). The scientific status of learning styles theories. Teaching of Psychology., 42 (3), 266–271. https://doi.org/10.1177/0098628315589505 .

Yalcinalp & Avcı. (2019). Creativity and emerging digital educational technologies: A systematic review. The Turkish Online Journal of Educational Technology, 18 (3), 25–45.

Yang, J., Huang, R., & Li, Y. (2013). Optimizing classroom environment to support technology enhanced learning. In A. Holzinger & G. Pasi (Eds.), Human-computer interaction and knowledge discovery in complex (pp. 275–284). Berlin: Springer.

Zhang, H. (2017). Accommodating different learning styles in the teaching of economics: With emphasis on Fleming and Mills's sensory-based learning style typology. Applied Economics and Finance, 4 (1), 72–78.


Acknowledgements

The author would like to thank the Deanship of Scientific Research at Umm Al-Qura University for the continuous support. This work was supported financially by the Deanship of Scientific Research at Umm Al-Qura University to Dr.: Hassan Abd El-Aziz El-Sabagh. (Grant Code: 18-EDU-1-01-0001).

Author information

Hassan A. El-Sabagh is an assistant professor in the E-Learning Deanship and head of the Instructional Programs Department, Umm Al-Qura University, Saudi Arabia, where he has worked since 2012. He has extensive experience in the field of e-learning and educational technologies, having served primarily in the Educational Technology Department of the Faculty of Specific Education, Mansoura University, Egypt, since 1997. In 2011, he earned a Ph.D. in Educational Technology from Dresden University of Technology, Germany. He has published over 14 papers in international journals and conference proceedings and serves as a peer reviewer for several international journals. His current research interests include e-learning environment design, online learning, LMS-based interactive tools, augmented reality, personalized and adaptive learning environment design, digital education, quality and online course design, and security issues in e-learning environments. (E-mail: [email protected]; [email protected]).

Authors and Affiliations

E-Learning Deanship, Umm Al-Qura University, Mecca, Saudi Arabia

Hassan A. El-Sabagh

Faculty of Specific Education, Mansoura University, Mansoura, Egypt


Contributions

The author read and approved the final manuscript.

Corresponding author

Correspondence to Hassan A. El-Sabagh .

Ethics declarations

Competing interests.

The author declares that there is no conflict of interest.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

El-Sabagh, H.A. Adaptive e-learning environment based on learning styles and its impact on development students' engagement. Int J Educ Technol High Educ 18 , 53 (2021). https://doi.org/10.1186/s41239-021-00289-4

Download citation

Received : 24 May 2021

Accepted : 19 July 2021

Published : 01 October 2021

DOI : https://doi.org/10.1186/s41239-021-00289-4


Keywords

  • Adaptive e-Learning
  • Learning style
  • Learning impact


Open Access

Peer-reviewed

Research Article

Differentiating the learning styles of college students in different disciplines in a college English blended learning setting

  • Jie Hu, 
  • Yi Peng, 
  • Xueliang Chen, 

* E-mail: [email protected]

Affiliations: Department of Linguistics, School of International Studies, Zhejiang University, Hangzhou City, Zhejiang Province, China; Center for College Foreign Language Teaching, Zhejiang University, Hangzhou City, Zhejiang Province, China; Institute of Asian Civilizations, Zhejiang University, Hangzhou City, Zhejiang Province, China

  • Published: May 20, 2021
  • https://doi.org/10.1371/journal.pone.0251545

Learning styles are critical to educational psychology, especially when investigating various contextual factors that interact with individual learning styles. Drawing upon Biglan’s taxonomy of academic tribes, this study systematically analyzed the learning styles of 790 sophomores in a blended learning course with 46 specializations using a novel machine learning algorithm called the support vector machine (SVM). Moreover, an SVM-based recursive feature elimination (SVM-RFE) technique was integrated to identify the differential features among distinct disciplines. The findings of this study shed light on the optimal feature sets that collectively determined students’ discipline-specific learning styles in a college blended learning setting.

Citation: Hu J, Peng Y, Chen X, Yu H (2021) Differentiating the learning styles of college students in different disciplines in a college English blended learning setting. PLoS ONE 16(5): e0251545. https://doi.org/10.1371/journal.pone.0251545

Editor: Haoran Xie, Lingnan University, HONG KONG

Received: May 15, 2020; Accepted: April 29, 2021; Published: May 20, 2021

Copyright: © 2021 Hu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the paper and its Supporting Information files.

Funding: This research was supported by the Philosophical and Social Sciences Planning Project of Zhejiang Province in 2020 [grant number 20NDJC01Z] with the recipient Jie Hu, Second Batch of 2019 Industry-University Collaborative Education Project of Chinese Ministry of Education [grant number 201902016038] with the recipient Jie Hu, SUPERB College English Action Plan with the recipient Jie Hu, and the Fundamental Research Funds for the Central Universities of Zhejiang University with the recipient Jie Hu.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Research background.

Learning style, as an integral and vital part of a student’s learning process, has been constantly discussed in the field of education and pedagogy. Originally developed from the field of psychology, psychological classification, and cognitive research several decades ago [ 1 ], the term “learning style” is generally defined as the learner’s innate and individualized preference for ways of participation in learning practice [ 2 ]. Theoretically, learning style provides a window into students’ learning processes [ 3 , 4 ], predicts students’ learning outcomes [ 5 , 6 ], and plays a critical role in designing individualized instruction [ 7 ]. Knowing a student’s learning style and personalizing instruction to students’ learning style could enhance their satisfaction [ 8 ], improve their academic performance [ 9 ], and even reduce the time necessary to learn [ 10 ].

Researchers in recent years have explored students’ learning styles from various perspectives [ 11 – 13 ]. However, knowledge of the learning styles of students from different disciplines in blended learning environments is limited. In an effort to address this gap, this study aims to achieve two major objectives. First, it investigates how disciplinary background shapes students’ learning styles in a blended learning environment, based on data collected in a compulsory college English course. Students from 46 disciplines were enrolled in this course, providing a rich pool of disciplinary variation for investigating learning styles. Second, it introduces a machine learning method, the support vector machine (SVM), to the field of education to identify an optimal set of factors that can simultaneously differentiate students of different academic disciplines. Based on data for students from 46 disciplines, this research examines the effects of a large number of variables related to students’ learning styles with the help of a powerful machine learning algorithm. Considering the convergence of a wide range of academic disciplines and the detection of latent interactions between a large number of variables, this study aims to provide a clear picture of the relationship between disciplinary factors and students’ learning styles in a blended learning setting.

Literature review

Theories of learning styles.

Learning style is broadly defined as the inherent preferences of individuals as to how they engage in the learning process [ 2 ], and the “cognitive, affective and physiological traits” of students have received special attention [ 14 ]. To date, there has been a proliferation of learning style definitions proposed to explain people’s learning preferences, each focusing on different aspects. Efforts to dissect learning style have been contested, with some highlighting the dynamic process of the learner’s interaction with the learning environment [ 14 ] and others underlining individualized ways of information processing [ 15 ]. One vivid explication involves the metaphor of an onion, pointing to the multilayered nature of learning styles: the outermost layer of a learning style can change in accordance with the external environment, while the inner layers are relatively stable [ 16 , 17 ]. In addition, strong interest in this field over the last three decades has led to a proliferation of models germane to learning styles, including the Kolb model [ 18 ], the Myers-Briggs Type Indicator model [ 19 ] and the Felder-Silverman learning style model (FSLSM) [ 20 ]. These learning style models have provided useful analytical lenses for analyzing students’ learning styles. The Kolb model focuses on learners’ thinking processes and identifies four types of learning, namely, diverging, assimilating, converging, and accommodating [ 18 ]. The Myers-Briggs Type Indicator model classifies learners into extraversion and introversion types, with the former preferring to learn from interpersonal communication and the latter inclined to benefit from personal experience [ 19 ]. As the most popular available model, the FSLSM identifies eight categories of learners according to the four dimensions of perception, input, processing and understanding [ 20 ]. In contrast to other learning style models that divide students into only a few groups, the FSLSM describes students’ learning styles in a more detailed manner. The four paired dimensions distinguish students’ engagement in the learning process at a fine-grained level, providing a solid basis for steady and reliable learning style analysis [ 21 ]. In addition, it has been argued that the FSLSM is the most appropriate model for a technology-enhanced learning environment because it incorporates important theories of cognitive learning behaviors [ 22 , 23 ]. Therefore, a large number of scholars have based their investigations of students’ learning styles in e-learning and computer-aided learning environments on the FSLSM [ 24 – 28 ].

Learning styles and FSLSM.

Different students receive, process, and respond to information with different learning styles. A theoretical model of learning style can be used to categorize people according to their idiosyncratic learning styles. In this study, the FSLSM was adopted as a theoretical framework to address the collective impacts of differences in students’ learning styles across different disciplines (see Fig 1 ).

Fig 1. This model specifies the four dimensions of the construct of learning style: visual/verbal, sensing/intuitive, active/reflective, and sequential/global. These four dimensions correspond to four psychological processes: input, perception, processing, and understanding. ( https://doi.org/10.1371/journal.pone.0251545.g001 )

The FSLSM describes learning styles along four dimensions.

Visual learners process information best when it is presented as graphs, pictures, etc., while verbal learners prefer spoken cues and remember best what they hear. Sensory learners like working with facts, data, and experimentation, while intuitive learners prefer abstract principles and theories. Active learners like to try things and learn through experimentation, while reflective learners prefer to think things through before taking action. Sequential learners absorb knowledge in a linear fashion and make progress step by step, while global learners tend to grasp the big picture before filling in all the details.
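For readers who prefer a concrete representation, the four paired ILS dimensions described above can be captured in a small data structure. The sketch below is purely illustrative; the dictionary keys and helper function are assumed names, not terms from the ILS questionnaire or the paper:

```python
# Illustrative encoding of the four FSLSM/ILS dimensions (assumed names, for illustration only).
FSLSM_DIMENSIONS = {
    "processing":    ("active", "reflective"),   # try things out vs. think them through
    "perception":    ("sensing", "intuitive"),   # facts and data vs. abstract principles
    "input":         ("visual", "verbal"),       # pictures and graphs vs. spoken/written words
    "understanding": ("sequential", "global"),   # step-by-step vs. big-picture first
}

def describe(dimension: str, pole_index: int) -> str:
    """Return a short label such as 'visual (input)' for a chosen pole of a dimension."""
    poles = FSLSM_DIMENSIONS[dimension]
    return f"{poles[pole_index]} ({dimension})"

if __name__ == "__main__":
    print(describe("input", 0))  # -> "visual (input)"
```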

Learning styles and academic disciplines.

Learning styles vary depending on a series of factors, including but not limited to age [ 29 ], gender [ 30 ], personality [ 2 , 31 ], learning environment [ 32 ] and learning experience [ 33 ]. In the higher education context, the academic discipline seems to be an important variable that influences students’ distinctive learning styles, which echoes a multitude of investigations [ 29 , 34 – 41 ]. One notable study explored the learning styles of students from 4 clusters of disciplines in an academic English language course and proposed that the academic discipline is a significant predictor of students’ learning styles, with students from the soft-pure, soft-applied, hard-pure and hard-applied disciplines each favoring different learning modes [ 42 ]. In particular, researchers used the Inventory of Learning Styles (ILS) questionnaire and found prominent disparities in learning styles between students from four different disciplinary backgrounds in the special educational field of vocational training [ 43 ]. These studies have found significant differences between the learning styles of students from different academic disciplines, thus supporting the concept that learning style could be domain dependent.

Learning styles in an online/blended learning environment.

Individuals’ learning styles reflect their adaptive orientation to learning and are not fixed personality traits. Consequently, learning styles can vary among diverse contexts, and related research in different contexts is vital to understanding learning styles in greater depth. Web-based technologies eliminate barriers of space and time and have become integrated in individuals’ daily lives and learning habits. Online and blended learning have begun to pervade virtually every aspect of the education landscape [ 40 ], and this warrants close attention. In addition to a series of studies that reflected upon the application of information and communication technology in the learning process [ 44 , 45 ], recent studies have found a mixed picture of whether students in a web-based/blended learning environment have a typical preference for learning.

Online learning makes it possible for students to set their own goals and develop an individualized study plan, equipping them with more learning autonomy [ 46 ]. Generally, students with a more independent learning style, greater self-regulating behavior and stronger self-efficacy are found to be more successful in an online environment [ 47 ]. To date, researchers have made substantial contributions to the identification and prediction of learning styles in an online learning environment [ 27 , 48 – 51 ]. For instance, one study focused on the manifestation of college students’ learning styles in a purely computer-based learning environment to evaluate the different learning styles of web learners in online courses, indicating that students’ learning styles were significantly related to online participation [ 49 ]. Students’ learning styles in interactive e-learning have also been meticulously investigated, and online tutorials were found to contribute to students’ academic performance regardless of their learning styles [ 51 ].

As a flexible learning method, blended courses combine the advantages of both online learning and traditional teaching methods [ 52 ]. Researchers have investigated students’ learning styles within this context and have identified a series of prominent factors, including perceived satisfaction and technology acceptance [ 53 ], the dynamics of the online/face-to-face environment [ 54 ], and curriculum design [ 55 ]. Based on the Visual, Aural, Read/Write and Kinesthetic model, a comprehensive study scrutinized the learning styles of K12 students in a blended learning environment, elucidating the effect of the relationship between personality, learning style and satisfaction on educational outcomes [ 56 ]. A recent study underscored the negative effects of a kinesthetic learning style on students’ academic performance in the context of blended learning, while positive effects of visual or auditory learning styles were also noted [ 57 ].

Considering that academic discipline and learning environment are generally regarded as essential predictors of students’ learning styles, some studies have also concentrated on the effects of academic discipline in a blended learning environment. Focusing on college students’ learning styles in a computer-based learning environment, one study evaluated the different learning styles of web learners, namely, visual, sensing, global and sequential learners, in online courses. According to that analysis, compared with students from other colleges, liberal arts students are more susceptible to the uneasiness that may result from remote teaching because of their learning styles [ 11 ]. A similar effort was made with the help of CMS tool usage logs and course evaluations to explore the learning styles of disciplinary quadrants in the online learning environment. The results indicated that there were noticeable differences in tool preferences between students from different domains [ 12 ]. In comparison, within the context of blended learning, a comprehensive study employed chi-square statistics on the basis of the Community of Inquiry (CoI) presences framework, arguing that soft-applied discipline learners in the blended learning environment prefer the kinesthetic learning style, while no correlations between the learning styles of soft-pure and hard-pure discipline students and the CoI presences were identified. It was also noted that students’ blended learning experience depends heavily on academic discipline, especially for students in hard-pure disciplines [ 13 ].

Research gaps and research questions

Overall, the research seems to be gaining traction, and new perspectives are continually introduced. The recent literature on learning styles mostly focuses on the exploration of disciplinary effects on variation in learning styles, and some of these studies were conducted within the blended environment. However, most of the studies focused only on several discrete disciplines or included only a small group of student samples [ 34 – 41 ]. Data in these studies were gathered through specialized courses such as academic English language [ 42 ] rather than compulsory courses available to students from all disciplines. Even though certain investigations included a large number of samples [ 49 ], the role of teaching was emphasized rather than students’ learning styles. In addition, it is often overlooked that a large number of variables related to learning styles could jointly distinguish students from different academic disciplines in a blended learning environment, yet a comprehensive analysis that takes into consideration the effects of this large set of variables has remained absent. Therefore, one goal of the present study is to fill this gap and shed light on this topic.

Another issue addressed in this study is the selection of an optimal measurement that can effectively identify and differentiate individual learning styles [ 58 ]. The effective identification and differentiation of individual learning styles can not only help students develop greater awareness of their learning but also provide teachers with the necessary input to design tailor-made instructions in pedagogical practice. Currently, there are two general approaches to identify learning styles: a literature-based approach and a data-driven approach. The literature-based approach tends to borrow established rules from the existing literature, while the data-driven approach tends to construct statistical models using algorithms from fields such as machine learning, artificial intelligence, and data mining [ 59 ]. Research related to learning styles has been performed using predominantly traditional instruments, such as descriptive statistics, Spearman’s rank correlation, coefficient R [ 39 ], multivariate analysis of variance [ 56 ] and analysis of variance (ANOVA) [ 38 , 43 , 49 , 57 ]. Admittedly, these instruments have been applied and validated in numerous studies, in different disciplines, and across multiple timescales. Nevertheless, some of the studies using these statistical tools did not identify significant results [ 36 , 53 , 54 ] or reached only loose conclusions [ 60 ]; this might be because of the inability of these methods to probe into the synergistic effects of variables. However, the limited functions of comparison, correlation, prediction, etc. are being complemented by a new generation of technological innovations that promise more varied approaches to addressing social and scientific issues. Machine learning is one such approach that has received much attention both in academia and beyond. As a subset of artificial intelligence, machine learning deals with algorithms and statistical models on computer systems, performing tasks based on patterns and inference instead of explicit instruction. As such, it can deal with high volumes of data at the same time, perform tasks automatically and independently, and continuously improve its performance based on past experience [ 54 ]. Similar machine learning approaches have been proposed and tested by different scholars to identify students’ learning styles, with varying results regarding the classification of learning styles. For instance, a study that examined the precision levels of four computational intelligence approaches, i.e., artificial neural network, genetic algorithm, ant colony system and particle swarm optimization, found that the average precision of learning style differentiation ranged between 66% and 77% [ 61 ]. Another study that classified learning styles through SVM reported accuracy levels ranging from 53% to 84% [ 62 ]. A comparison of the prediction performance of SVM and artificial neural networks found that SVM has higher prediction accuracy than the latter [ 63 ]. This was further supported by another study, which yielded a similar result between SVM and the particle swarm optimization algorithm [ 64 ]. Moreover, when complemented by a genetic algorithm [ 65 ] and ant colony system [ 66 ], SVM has also shown improved results. These findings across different fields point to the reliability of SVM as an effective statistical tool for identification and differentiation analysis.

Therefore, a comprehensive investigation across the four general disciplines in Biglan’s taxonomy using a strong machine learning approach is needed. Given the existence of the research gaps discussed above, this exploratory study seeks to address the following questions:

  • Can students’ learning styles be applied to differentiate various academic disciplines in the blended learning setting? If so, what are the differentiability levels among different academic disciplines based on students’ learning styles?
  • What are the key features that can be selected to determine the collective impact on differentiation by a machine learning algorithm?
  • What are the collective impacts of optimal feature sets?

Materials and methods

This study adopted a quantitative approach for the analysis. First, a modified and translated version of the original ILS questionnaire was administered to collect scores for students’ learning styles. Then, two alternate data analyses were performed separately. One analysis involved a traditional ANOVA, which tested the main effect of discipline on students’ learning styles in each ILS dimension. The other analysis involved the support vector machine (SVM) technique to test its performance in classifying students’ learning styles in the blended learning course among 46 specializations. Then, SVM-based recursive feature elimination (SVM-RFE) was employed to specify the impact of students’ disciplinary backgrounds on their learning styles in blended learning. By referencing the 44 questions (operationalized as features in this study) in the ILS questionnaire, SVM-RFE could rank these features based on their relative importance in differentiating different disciplines and identify the key features that collectively differentiate the students’ learning style. These steps are intended to not only identify students’ learning style differences but also explain such differences in relation to their academic disciplinary backgrounds.

Participants

The participants included 790 sophomores taking the blended English language course from 46 majors at Z University. Sophomore students were selected for this study for two reasons. First, sophomores are one of the only two groups of students (the other group being college freshmen) who take a compulsory English language course, namely, the College English language course. Second, of these two groups of students, sophomores have received academic discipline-related education, while their freshmen counterparts have not had disciplinary training during the first year of college. In the College English language course, online activities, representing 55% of the whole course, include e-course teaching designed by qualified course teachers or professors, courseware usage for online tutorials, forum discussion and essay writing, and two online quizzes. Offline activities, which represent 45% of the whole course, include role-playing, ice-breaker activities, group presentations, an oral examination, and a final examination. Therefore, the effects of the academic discipline on sophomores’ learning styles might be sufficiently salient to warrant a comparison in a blended learning setting [ 67 ]. Among the participants, 420 were male, and 370 were female. Most participants were aged 18 to 19 years and had taken English language courses for at least 6 years. Based on Biglan’s typology of disciplinary fields, the students’ specializations were classified into the four broad disciplines of hard-applied (HA, 289/37.00%), hard-pure (HP, 150/19.00%), soft-applied (SA, 162/20.00%), and soft-pure (SP, 189/24.00%).

Biglan’s classification scheme of academic disciplines (hard (H) vs. soft (S) disciplines and pure (P) vs. applied (A) disciplines) has been credited as the most cited organizational system of academic disciplines in tertiary education [ 68 – 70 ]. Many studies have also provided evidence supporting the validity of this classification [ 69 ]. Over the years, research has indicated that Biglan’s typology is correlated with differences in many other properties and serves as an appropriate mechanism to organize discipline-specific knowledge or epistemologies [ 38 ] and design and deliver courses for students with different learning style preferences [ 41 ]. Therefore, this classification provides a convenient framework to explore differences across disciplinary boundaries. In general, HA disciplines include engineering, HP disciplines include the so-called natural sciences, SA disciplines include the social sciences, and SP disciplines include the humanities [ 41 , 68 , 71 ].
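As a rough illustration of how specializations can be coded into Biglan quadrants, consider the sketch below; the example majors and their assignments are hypothetical and do not reproduce the paper's actual coding of the 46 specializations:

```python
# Hypothetical examples of mapping majors to Biglan quadrants (illustrative only).
BIGLAN_QUADRANTS = {
    "mechanical engineering": "HA",  # hard-applied: engineering
    "physics":                "HP",  # hard-pure: natural sciences
    "economics":              "SA",  # soft-applied: social sciences
    "history":                "SP",  # soft-pure: humanities
}

def quadrant(major: str) -> str:
    """Look up the Biglan quadrant for a major; raises KeyError if the major is not coded."""
    return BIGLAN_QUADRANTS[major.lower()]

print(quadrant("Physics"))  # -> "HP"
```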

In learning style research, it is difficult to select an instrument to measure the subjects’ learning styles [ 72 ]. The criteria used for the selection of a learning style instrument in this study include the following: 1) successful use of the instrument in previous studies, 2) demonstrated validity and reliability, 3) a match between the purpose of the instrument and the aim of this study and 4) open access to the questionnaire.

Felder and Soloman’s ILS questionnaire, which was built based on the FSLSM, was adopted in the present study to investigate students’ learning styles across different disciplines. First, the FSLSM is recognized as the most commonly used model for measuring individual learning styles on a general scale [ 73 ] in higher education [ 74 ] and has remained popular for many years across different disciplines in university settings and beyond. In the age of personalized instruction, this model has breathed new life into areas such as blended learning [ 75 ], online distance learning [ 76 ], courseware design [ 56 ], and intelligent tutoring systems [ 77 , 78 ]. Second, the FSLSM builds on previous learning style models; it integrates their advantages and is thus more comprehensive in delineating students’ learning styles [ 79 , 80 ]. Third, the FSLSM has good predictive ability with independent testing sets (i.e., unknown learning style objects) [ 17 ] and has been repeatedly proven to be a more accurate, reliable, and valid model than most other models for predicting students’ learning performance [ 10 , 80 ]. Fourth, the ILS is a free instrument that can be openly accessed online (URL: https://www.webtools.ncsu.edu/learningstyles/ ) and has been widely used in the research context [ 81 , 82 ].

The modified and translated version of the original ILS questionnaire includes 44 questions in total, with 11 questions corresponding to each dimension of the Felder-Silverman model as follows: questions 1–11 correspond to dimension 1 (active vs. reflective), questions 12–22 correspond to dimension 2 (sensing vs. intuitive), questions 23–33 correspond to dimension 3 (visual vs. verbal), and questions 34–44 correspond to dimension 4 (sequential vs. global). Each question is followed by five choices on a five-point Likert scale: “strongly agree with A (1)”, “agree with A (2)”, “neutral (3)”, “agree with B (4)” and “strongly agree with B (5)”. Option A and option B represent the two choices offered in the original ILS questionnaire.
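A minimal sketch of how such responses can be aggregated into per-dimension composite scores; the simple summation mirrors the composite scores used in the ANOVA reported later, and the variable names are assumptions rather than code from the study:

```python
from typing import Dict

# Question indices (1-based) per ILS dimension, as described above.
DIMENSIONS: Dict[str, range] = {
    "active_reflective": range(1, 12),    # questions 1-11
    "sensing_intuitive": range(12, 23),   # questions 12-22
    "visual_verbal":     range(23, 34),   # questions 23-33
    "sequential_global": range(34, 45),   # questions 34-44
}

def dimension_scores(responses: Dict[int, int]) -> Dict[str, int]:
    """Sum the 1-5 Likert responses of the 11 questions in each dimension.

    `responses` maps question number (1-44) to the chosen option (1-5).
    Each composite score therefore falls between 11 and 55.
    """
    return {name: sum(responses[q] for q in qs) for name, qs in DIMENSIONS.items()}

# Example: a respondent who answered "neutral" (3) to every question.
example = {q: 3 for q in range(1, 45)}
print(dimension_scores(example))  # each dimension sums to 33
```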

Ethics statements

The free questionnaires were administered in a single session by specialized staff who collaborated on the investigation. The participants completed all questionnaires individually. The study procedures were in accordance with the ethical standards of the Helsinki Declaration and were approved by the Ethics Committee of the School of International Studies, Zhejiang University. All participants signed written informed consent to authorize their participation in this research. After completion of the informed consent form, each participant was provided a gift (a pen) in gratitude for their contribution and participation.

Data collection procedure

Before the questionnaires were distributed, the researchers involved in this study contacted faculty members from various departments and requested their help. After permission was given, the printed questionnaires were administered to students under the supervision of their teachers at the end of their English language course. The students were informed of the purpose and importance of the study and asked to complete the questionnaires carefully. The students were also assured that their personal information would be used for research purposes only. All students provided written informed consent (see S2 File ). After the questionnaires were completed and returned, they were thoroughly examined by the researchers so that problematic questionnaires could be identified and excluded from further analysis. All questionnaires eligible for the data analysis had to meet two standards: first, all questions had to be answered, and second, the answers had to reflect a reasonable logic. For the few missing values, each gap was filled with the median of the given individual’s responses to the 11 questions in the corresponding ILS dimension. In statistics, using the median to impute missing values is common and acceptable because the missing values represent only a small minority of the entire dataset and are assumed not to have a large impact on the final results [ 83 , 84 ].
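A minimal sketch of this per-student, per-dimension median imputation, assuming responses are stored in a pandas DataFrame with one column per question; the column names (Q1 through Q44) are an assumed convention, not from the study:

```python
import pandas as pd

# Hypothetical layout: one row per student, columns Q1..Q44 with values 1-5 (NaN = missing).
DIMENSION_COLUMNS = {
    "active_reflective": [f"Q{i}" for i in range(1, 12)],
    "sensing_intuitive": [f"Q{i}" for i in range(12, 23)],
    "visual_verbal":     [f"Q{i}" for i in range(23, 34)],
    "sequential_global": [f"Q{i}" for i in range(34, 45)],
}

def impute_by_dimension_median(df: pd.DataFrame) -> pd.DataFrame:
    """Fill each student's missing answers with that student's median response
    within the same ILS dimension (row-wise median over the 11 questions)."""
    out = df.copy()
    for cols in DIMENSION_COLUMNS.values():
        row_medians = out[cols].median(axis=1)        # per-student median in this dimension
        for col in cols:
            out[col] = out[col].fillna(row_medians)   # fill gaps with that median (aligned by row)
    return out
```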

In total, 850 questionnaires were administered to the students, and 823 were retrieved. Of the retrieved questionnaires, 790 were identified as appropriate for further use. After data screening, these questionnaires were organized, and their results were transferred to an Excel spreadsheet.

Data analysis method

During the data analysis, the free package LIBSVM ( https://www.csie.ntu.edu.tw/~cjlin/libsvm/ ), a library for the SVM, was first applied. Then, a traditional ANOVA was performed to examine whether there was a main effect of academic discipline on Chinese students’ learning styles. The ANOVA was performed using SPSS, a widely used software package that supports a range of statistical analyses. For examining the effect of a single or a few independent variables, SPSS ANOVA can produce satisfactory results. However, the SVM, a classic data mining algorithm, outperforms ANOVA for datasets in which a large number of multidimensional variables are intertwined and their combined/collective effects influence the classification results. In this study, the research objective was to efficiently differentiate the disciplines and detect the key features among the 44 factors. A single factor or a few factors alone might not be significant enough to discriminate the learning styles among the different disciplines, whereas the effects of multiple features selected by the SVM may collectively enhance the classification performance. A further reason for selecting the SVM over ANOVA is that in the latter case, the responses on all questions in a single dimension are summed instead of treated as individual scores; thus, the by-item variation is concealed. In addition, the SVM is especially suitable for statistical analysis with high-dimensional factors (usually > 10; 44-dimensional factors were included in this study) and can detect the effects collectively imposed by a feature set [ 85 ].

Originally proposed in 1992 [ 86 ], the SVM is a supervised machine learning model that can be used for classification, data analysis, pattern recognition, and regression analysis. The SVM is an efficient classification model that optimally divides data into two categories and is ranked among the top methods in statistical learning theory due to its originality and practicality [ 85 ]. Due to its robustness and its accurate classification and prediction performance [ 87 – 89 ], the SVM has high reproducibility [ 90 , 91 ]. Because the computing process of the SVM is not easily visualized, it has been described as a “black box” method [ 92 ]; however, future studies in the emerging field of explainable artificial intelligence can help solve this problem and convert this approach to a “glass box” method [ 67 ]. This algorithm has proven to have a solid theoretical foundation and excellent empirical application in the social sciences, including education [ 93 ] and natural language processing [ 94 ]. The mechanism underlying the SVM is presented in Fig 2 .

Fig 2. Hyperplanes 1 and 2 are two candidate separating lines that divide the data into two groups. Hyperplane 1 is considered the best because it maximizes the margin between the two groups. ( https://doi.org/10.1371/journal.pone.0251545.g002 )
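To make the maximum-margin idea in Fig 2 concrete, here is a minimal sketch using scikit-learn's SVC, whose implementation wraps the LIBSVM library used in this study; the toy data points are invented purely for illustration:

```python
import numpy as np
from sklearn.svm import SVC

# Two toy groups in a 2-D feature space (invented data, for illustration only).
X = np.array([[1.0, 1.2], [1.5, 0.8], [2.0, 1.0],    # group 0
              [4.0, 4.2], [4.5, 3.8], [5.0, 4.0]])   # group 1
y = np.array([0, 0, 0, 1, 1, 1])

# A linear-kernel SVM finds the separating hyperplane that maximizes the
# margin between the two groups (Hyperplane 1 in Fig 2).
clf = SVC(kernel="linear", C=1.0).fit(X, y)

print("Support vectors:", clf.support_vectors_)              # the boundary-defining samples
print("Prediction for a new point:", clf.predict([[4.2, 4.0]]))
```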

The SVM contains the following two modules: one module is a general-purpose machine learning method, and the other module is a domain-specific kernel function. The SVM training algorithm is used to build a training model that is then used to predict the category to which a new sample instance belongs [ 95 ]. When a set of training samples is given, each sample is given the label of one of two categories. To evaluate the performance of SVM models, a confusion matrix, which is a table describing the performance of a classifier on a set of test data for which the true values are known, is used (see Table 1 ).

Table 1. Confusion matrix used to evaluate the SVM models. ( https://doi.org/10.1371/journal.pone.0251545.t001 )


  • ACC represents the proportion of true results, including both positive and negative results, in the selected population;
  • SPE represents the proportion of actual negatives that are correctly identified as such;
  • SEN represents the proportion of actual positives that are correctly identified as such;
  • AUC is a ranking-based measure of classification performance that quantifies how well a randomly chosen positive example can be distinguished from a randomly chosen negative example; and
  • the F-measure is the harmonic mean of precision (another performance indicator) and recall.

The ACC is a metric frequently applied to measure classification performance, but combining the SPE, SEN, AUC, F-measure and ACC provides an enhanced performance assessment and has frequently been applied in recent studies [ 96 ]. In particular, the AUC is frequently applied to validate the general performance of models [ 97 ]. The advantage of this measure is that it is invariant to relative class distributions and class-specific error costs [ 98 , 99 ]. Moreover, to some extent, the AUC is statistically consistent and more discriminating than the ACC with balanced and imbalanced real-world data sets [ 100 ], which makes it especially suitable for unequal samples, such as the HA-HP model in this study. After all data preparations were completed, the data used for the comparisons were extracted separately. First, the processed data of the training set were run using optimized parameters. Second, the constructed model was used to predict the test set, and the five indicators were obtained for each of the five folds and then averaged. Cross-validation is a general validation procedure used to assess how well the results of a statistical analysis generalize to an independent data set and to evaluate the stability of the statistical model. K-fold cross-validation is commonly used to search for the best hyperparameters of the SVM to achieve the highest accuracy [ 101 ]. In particular, fivefold, tenfold, and leave-one-out cross-validation are typically used versions of k-fold cross-validation [ 102 , 103 ]. Fivefold cross-validation was selected because it can generally achieve a good prediction performance [ 103 , 104 ] and has been commonly used as a popular rule of thumb supported by empirical evidence [ 105 ]. In this study, five folds (groups) of subsets were randomly divided from the entire set by the SVM; four folds (the training sample) were used to develop a prediction model, while the remaining fold (the test sample) was used for validation. These functions were all implemented with Python Programming Language version 3.7.0 (URL: https://www.python.org/ ).
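A hedged sketch of this fivefold evaluation scheme with the five indicators, using scikit-learn rather than the raw LIBSVM interface; the feature matrix X (44 ILS item scores) and the binary discipline labels y are random placeholders, and the hyperparameters are not those tuned in the study:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, recall_score, roc_auc_score, f1_score

rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(200, 44)).astype(float)   # placeholder: 44 ILS item scores per student
y = rng.integers(0, 2, size=200)                        # placeholder: 0 = one discipline, 1 = the other

scores = {"ACC": [], "SPE": [], "SEN": [], "AUC": [], "F1": []}
for train_idx, test_idx in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True).fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    prob = clf.predict_proba(X[test_idx])[:, 1]
    scores["ACC"].append(accuracy_score(y[test_idx], pred))
    scores["SPE"].append(recall_score(y[test_idx], pred, pos_label=0))  # specificity = recall of negatives
    scores["SEN"].append(recall_score(y[test_idx], pred, pos_label=1))  # sensitivity = recall of positives
    scores["AUC"].append(roc_auc_score(y[test_idx], prob))
    scores["F1"].append(f1_score(y[test_idx], pred))

print({k: round(float(np.mean(v)), 4) for k, v in scores.items()})  # fivefold averages
```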

Then, SVM-RFE, an embedded feature selection strategy that was first applied to identify differentially expressed genes between patients and healthy individuals [ 106 ], was adopted. SVM-RFE has proven to be more robust to data overfitting than other feature selection techniques and has shown its power in many fields [ 107 ]. This approach works by iteratively removing the feature with the smallest weight and appending it to a feature ranking until a group of highly weighted features remains. After this feature selection procedure, several SVM models were again constructed based on the selected features, and their performance was compared to that of the original models with all features included. The experimental process is provided in Fig 3 for ease of reference.

Fig 3. Experimental process of the SVM and SVM-RFE analyses. ( https://doi.org/10.1371/journal.pone.0251545.g003 )
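A minimal sketch of the recursive feature elimination step; scikit-learn's RFE needs an estimator that exposes feature weights, so a linear-kernel SVC is used here (the kernel choice for the ranking step is an assumption, as the paper does not specify it), and the data are again random placeholders:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(200, 44)).astype(float)   # placeholder ILS item scores
y = rng.integers(0, 2, size=200)                        # placeholder binary discipline labels

# Rank all 44 ILS items and keep the 20 most discriminative ones,
# removing the lowest-weighted feature one at a time (step=1).
selector = RFE(estimator=SVC(kernel="linear", C=1.0), n_features_to_select=20, step=1)
selector.fit(X, y)

top20_questions = [i + 1 for i, kept in enumerate(selector.support_) if kept]  # 1-based question numbers
print("Selected questions:", top20_questions)

# The reduced matrix X[:, selector.support_] can then be re-evaluated with the
# same fivefold scheme sketched above to compare SVM and SVM-RFE performance.
```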

The classification results produced by the SVM and the ranking of the top 20 features produced by SVM-RFE are listed in Table 2 . Twenty variables were selected in this study for two reasons: a data-based reason and a literature-based reason. First, models composed of 20 features generally perform better than the original models, while the performance of models with more than 20 features is negatively influenced. Second, SVM-based studies in the social sciences have identified 20 to 30 features as a good number for an optimal feature set [ 108 ], and 20 features have been selected for inclusion in an optimal feature set [ 95 ]. Therefore, in this study, the top 20 features were selected for subsequent analysis, as proposed in previous analyses that yielded accepted measurement rates. These 20 features retained most of the useful information from all 44 factors but with fewer features, which showed satisfactory representation [ 96 ].

Table 2. Classification results of the SVM models and the top 20 features ranked by SVM-RFE. ( https://doi.org/10.1371/journal.pone.0251545.t002 )

Results of RQ (1) What are the differentiability levels among different academic disciplines based on students’ learning styles?

To further measure the performance of the differentiability among students’ disciplines, the collected data were examined with the SVM algorithm. As shown in Table 2 , the five performance indicators, namely, the ACC, SPE, SEN, AUC and F-measure, were utilized to measure the SVM models. Regarding the two general performance indicators, i.e., the ACC value and AUC value, the HA-HP, HA-SA, and HA-SP-based models yielded a classification capacity of approximately 70.00%, indicating that the students in these disciplines showed a relatively large difference. In contrast, the models based on the H-S, A-P, HP-SA, HP-SP, and SA-SP disciplines only showed a moderate classification capacity (above 55.00%). This finding suggests that these five SVM models were not as effective as the other three models in differentiating students among these disciplines based on their learning styles. The highest ACC and AUC values were obtained in the model based on the HA-HP disciplines, while the lowest values were obtained in the model based on the HP-SA disciplines. As shown in Table 2 , the AUCs of the different models ranged from 57.76% (HP-SA) to 73.97% (HA-HP).

To compare the results of the SVM model with another statistical analysis, an ANOVA was applied. Prior to the main analysis, the students’ responses in each ILS dimension were summed to obtain a composite score. All assumptions of ANOVA were checked, and no serious violations were observed. Then, an ANOVA was performed with academic discipline as the independent variable and the students’ learning styles as the dependent variable. The results of the ANOVA showed that there was no statistically significant difference in the group means of the students’ learning styles in Dimension 1, F(3, 786) = 2.56, p = .054, Dimension 2, F(3, 786) = 0.422, p = .74, or Dimension 3, F(3, 786) = 0.90, p = .443. However, in Dimension 4, a statistically significant difference was found in the group means of the students’ learning styles, F (3, 786) = 0.90, p = .005. As the samples in the four groups were unbalanced, post hoc comparisons using Scheffé’s method were performed, demonstrating that the means of the students’ learning styles significantly differed only between the HA (M = 31.04, SD = 4.986) and SP (M = 29.55, SD = 5.492) disciplines, 95.00% CI for MD [0.19, 2.78], p = .016, whereas the other disciplinary models showed no significant differences. When compared with the results obtained from the SVM models, the three models (HA-HP, HA-SA, and HA-SP models) presented satisfactory differentiability capability of approximately 70.00% based on the five indicators.
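For comparison, the one-way ANOVA step can be sketched as follows with SciPy; the composite scores below are random placeholders generated with the study's group sizes, and the Scheffé post hoc comparisons reported above are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Placeholder composite scores (sum of 11 items, range 11-55) for one ILS dimension.
ha = rng.integers(11, 56, size=289)   # hard-applied
hp = rng.integers(11, 56, size=150)   # hard-pure
sa = rng.integers(11, 56, size=162)   # soft-applied
sp = rng.integers(11, 56, size=189)   # soft-pure

f_stat, p_value = stats.f_oneway(ha, hp, sa, sp)   # main effect of discipline
print(f"F(3, 786) = {f_stat:.2f}, p = {p_value:.3f}")
```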

In the case of a significant result, it was difficult to determine which questions were representative of the significant difference. With a nonsignificant result, it was possible that certain questions might be relevant in differentiating the participants. However, this problem was circumvented in the SVM, where each individual question was treated as a variable and a value was assigned to indicate its relative importance in the questionnaire. Using SVM also circumvented the inherent problems with traditional significance testing, especially the reliance on p-values, which might become biased in the case of multiple comparisons [ 109 ].

Results of RQ (2) What are the key features that can be selected to determine the collective impact on differentiation by a machine learning algorithm?

To examine whether the model performance improved as a result of this feature selection procedure, the 20 selected features were submitted to another round of SVM analysis. The same five performance indicators were used to measure the model performance (see Table 2 ). A comparison of the SVM models and the SVM-RFE models presented in Table 2 shows that, except for the HA-SP model, all models presented a similar or improved performance after the feature selection process. In particular, the improvement in the HA-HP and HP-SA models was quite remarkable. For instance, in the HA-HP model, the ACC value increased from 69.32% in the SVM model to 82.59% in the SVM-RFE model, and the AUC score substantially increased from 73.97% in the SVM model to 89.13% in the SVM-RFE model. This finding suggests that the feature selection process refined the models’ classification accuracy and that the 20 features selected, out of all 44 factors, carry substantive information that might be informative for exploring disciplinary differences. Although the indicator values for the 20 selected features were not very high, all five indicators remained above 65.00%, showing that the models were still representative even though only 20 of the 44 factors were used. Considering that there was a significant reduction in the number of questions used for the model construction in SVM-RFE (compared with those used for the SVM model), the newly identified top 20 features were effective enough to preserve the differential ability of all 44 questions. Thus, these top 20 factors can be recognized as key differential features for distinguishing two distinct disciplines.

To identify these top 20 features in eight models (see Table 2 ), SVM-RFE was applied to rank order all 44 features contained in the ILS questionnaire. To facilitate a detailed understanding of what these features represent, the questions related to the top 20 features in the HA-HP model are listed in Table 3 for ease of reference.

Table 3. Questions related to the top 20 features in the HA-HP model. ( https://doi.org/10.1371/journal.pone.0251545.t003 )

Results of RQ (3) What are the collective impacts of optimal feature sets?

The collective impacts of the optimal feature sets can be interpreted from four aspects, namely, the complexity of students’ learning styles, the appropriateness of the SVM, the ranking produced by SVM-RFE and the multiple detailed comparisons between students from different disciplines. First, the FSLSM recognizes that students’ learning styles are shaped by a series of factors during the growth process, which intertwine and interact with each other. Considering the complex dynamics of learning style, an approach that can detect the combined effects of a group of variables is needed. Second, recent years have witnessed the emergence of data mining approaches to explore students’ learning styles [ 28 , 48 – 50 , 110 ]. Specifically, as one of the top machine learning algorithms, the SVM excels in identifying the combined effects of high-order factors [ 87 ]. In this study, the SVM proved to perform well in classifying students’ learning styles across different disciplines, with every indicator being acceptable. Third, the combination of the SVM with RFE enables the simultaneous discovery of multiple features that collectively determine classification. Notably, although SVM-RFE can rank the importance of the features, they should be regarded as an entire optimal feature set. In other words, it is the combination of these 20 features, rather than any single factor, that differentiates students’ learning styles across academic disciplines. Last but not least, the multiple comparisons between the different discipline-based SVM models provide the most effective learning style factors, giving researchers clues to the nuanced differences between students’ learning styles. It can be seen that students from different academic disciplines understand, see and reflect on things from individualized perspectives. The 20 most effective factors for all models were scattered across questions 1 to 44, reflecting students’ different learning styles in all four dimensions. Therefore, the FSLSM provides a useful and effective tool for evaluating students’ learning styles from a rather comprehensive point of view.

The following discussions address the three research questions explored in the current study.

Levels of differentiability among various academic disciplines based on students’ learning styles with SVM

The results suggest that the SVM is an effective approach for classification in the blended learning context, in which students with diverse disciplinary backgrounds can be distinguished from each other according to their learning styles. All performance indicators presented in Tables 2 and 3 remain above the baseline of 50.00%, suggesting that between each pair of disciplines, students’ learning style differences can be identified. To some extent, these differences can be identified with a relatively satisfactory classification capability (e.g., 69.32% for the ACC and 73.97% for the AUC in the HA-HP model shown in Table 2 ). Further support for the SVM algorithm is obtained from the SVM-RFE models constructed to rank the factors’ classification capacity; all values also remained above the baseline value, while some reached a relatively high classification capability (e.g., 82.59% for the ACC and 89.13% for the AUC in the HA-HP model shown in Table 2 ). While the results obtained mostly show a moderate ACC and AUC, they still provide some validity evidence supporting the role of the SVM as an effective binary classifier in the educational context. However, while these differences are noteworthy, the similarities among students in different disciplines also deserve attention. The results reported above indicate that in some disciplines the classification capacity is relatively low; this was the case for the model based on the SA-SP disciplines.

Regarding the low differentiability, one explanation might be the indistinct classification of some emerging “soft disciplines.” It was noted that psychology, for example, could be identified as “a discipline that can be considered predominantly ‘soft’ and slightly ‘purer’ than ‘applied’ in nature” [ 111 ] (p. 43–53), which could have blurred the line between the SA and SP disciplines. As there is no impassable gulf separating the SA and SP disciplines, their disciplinary differences may have diminished in the common practice of classroom lecturing. Another reason comes from the different cultivation models applied to the sampled students in the “soft disciplines” and “hard disciplines.” In high school, these students are generally divided into liberal arts students and science students and are then trained in different knowledge-delivery environments. Two years of unrelenting, intensive training allow liberal arts students to develop similar and persistent thinking and cognitive patterns. After the college entrance examination, most liberal arts students select SA or SP majors, and a year or more of university study does not exert strong effects on their learning styles. This helps explain why many researchers have traditionally investigated the SA and SP disciplines together, simply calling them “social science” or “soft disciplines” in contrast to “natural science” or “hard disciplines”; numerous contributions have pointed out similarities in the learning styles of students from “soft disciplines” [ 37 , 112 – 114 ]. However, students majoring in the natural sciences exhibit considerable differences in learning styles, suggesting that the talent cultivation model of the “hard disciplines” in universities is, to some extent, more influential on students’ learning styles than that of the “soft disciplines.” Further compelling interpretations of this phenomenon await the accumulation of sufficient knowledge among scholars in this area.

In general, these results are consistent with those reported in many previous studies based on the Felder-Silverman model. These studies tested the precision of different computational approaches in identifying and differentiating the learning styles of students. For example, by means of a Bayesian network (BN), an investigation obtained an overall precision of 58.00% in the active/reflective dimension, 77.00% in the sensing/intuitive dimension and 63.00% in the sequential/global dimension (the visual/verbal dimension was not considered) [ 81 ]. With the help of the keyword attributes of learning objects selected by students, a precision of 70.00% in the active/reflective dimension, 73.30% in the sensing/intuitive dimension, 73.30% in the sequential/global dimension and 53.30% in the visual/verbal dimension was obtained [ 115 ].

These results add to a growing body of evidence expanding the scope of application of the SVM algorithm. Currently, applications of the SVM still reside largely in engineering and other hard disciplines despite some tentative trials in the humanities and social sciences [ 26 ]. In addition, as cross-disciplinary programs increase in current higher education, it is essential to attend to the learning styles of students and researchers studying interdisciplinary subjects spanning the HA, HP, SA and SP disciplines. Therefore, the current study is the first to incorporate such a machine learning algorithm into interdisciplinary blended learning and has broader relevance to further learning style-related theoretical or empirical investigations.

Verification of the features included in the optimal feature sets

Features included in the optimal feature sets provided mixed findings compared with previous studies, although some of the 20 identified features are consistent with earlier work. A close examination of the individual questions included in the feature sets can offer some useful insights into the underlying psychological processes. For example, in six of the eight models constructed, Question 1 (“I understand something better after I try it out/think it through”) appears as the feature with the number 1 ranking, highlighting the great importance attached to this question. This question mainly reflects the dichotomy between experimentation and introspection. A possible revelation is that students across disciplines dramatically differ in how they process tasks, with the possible exception of the SA-SP disciplines. This difference has been supported by many previous studies. For example, it was found that technical students tended to be more tactile than those in the social sciences [ 116 ], and engineering students (known as HA in this study) were more inclined toward concrete and pragmatic learning styles [ 117 ]. Similarly, it was found that engineering students prefer “a logical learning style over visual, verbal, aural, physical or solitary learning styles” [ 37 ] (p. 122), while social sciences (known as SA in this study) students prefer a social learning style to a logical learning style. Although these studies differ in their focus to a certain degree, they provide an approximate idea of the potential differences among students in their respective disciplines. In general, students in the applied disciplines show a tendency to experiment with tasks, while those in the pure disciplines are more inclined towards introspective practices, such as an obsession with theories. For instance, in Biglan’s taxonomy of academic disciplines, students in HP disciplines prefer abstract rules and theories, while students in SA disciplines favor application [ 67 ]. Additionally, Question 10 (“I find it easier to learn facts/to learn concepts”) is similar to Question 1, as both questions indicate a certain level of abstraction or concreteness. The difference between facts and concepts is closely related to the distinction between declarative knowledge and procedural knowledge in cognitive psychology [ 35 , 38 ]. Declarative knowledge is static and similar to facts, while procedural knowledge is more dynamic and primarily concerned with operational steps. Students’ preferences for facts or concepts closely correspond to this psychological distinction.

In addition, Questions 2, 4, 7, and 9 also occur frequently among the 20 features selected for the different models. Question 2 ("I would rather be considered realistic/innovative") concerns the willingness to take chances. It reflects a difference in perspective, i.e., whether the focus should be on obtaining pragmatic results or on seeking original solutions. This difference cannot easily be attributed to the disciplinary factor alone. Instead, numerous factors, e.g., genetic, social and psychological ones, may play a strong role in shaping this trait, with the academic discipline serving only to strengthen or diminish it. For instance, decades of research in psychology have shown that males are more inclined towards risk-taking than females [ 118 – 121 ]. A careful examination of the current academic landscape also reveals a gender difference: more females than males choose soft disciplines, and more males than females choose hard disciplines. This pattern builds a disciplinary divide that sorts students into specific categories, potentially strengthening the disciplinary effect. Question 9 ("In a study group working on difficult material, I am more likely to jump in and contribute ideas/sit back and listen") likewise emphasizes the distinction between active participation and introspective thinking, reflecting an underlying psychological propensity in blended learning. Within this context, the significance of this question could also be explained by the psychological evaluation of "loss and gain", as students' different learning styles are associated with expected reward values and their internal motivational drives, which are shaped by their personality traits [ 122 ]. When faced with the risk of "losing face", whether students will express their ideas in front of a group depends largely on their risk and stress management capabilities and the presence of an appropriate motivation system.

The other two questions convey similar messages regarding personality differences. Question 4 concerns how individuals perceive the world, while Question 7 concerns the preferred modality of information processing. Evidence of disciplinary differences in these respects has also been reported [ 35 , 123 – 125 ]. The remaining questions, such as Questions 21, 27, and 39, reflect other aspects of potential personality differences and are largely consistent with the preceding discussion. This may also reflect the multi-faceted effects of blended learning, which can align with the features of each discipline to different degrees. First, teachers from different domains use technology in different ways, and students from different disciplines may view blended learning differently. For instance, the characteristics of soft-applied fields call for specialized customization in blended courses, further broadening the gulf between subjects [ 126 ]. Second, although blended learning is generally recognized as a stimulus to students' innovation [ 127 ], some students who are used to an instructivist approach in which the educator acts as a ‘sage on the stage’ will find it difficult to adapt to a social constructivist approach in which the educator serves as a ‘guide on the side’ [ 128 ]. This difficulty might not only negatively affect students' academic performance but also latently magnify the effects of different academic disciplines.

Interpretation of the collective impact of optimal feature sets

In each SVM model built on a pair of disciplines, the 20 key features selected (collectively known as an optimal feature set) exert a concerted effect on students' learning styles across disciplines (see Table 2). Examining the collective impact of each 20-feature set across the eight discipline models is especially pertinent given the emergence of cross-disciplinary programmes in academia. Current higher education often involves courses that cross disciplines and students with diverse disciplinary backgrounds. In addition, with the rise of technology-enhanced learning, the design of personalized tutoring systems requires more nuanced information about student attributes to provide greater adaptability [ 59 ]. Identifying these optimal feature sets makes such information accessible. Therefore, understanding such interdisciplinary factors and designing tailor-made instruction are essential for promoting learning success [ 9 ]. For example, in an English language classroom in which the students are a blend of HP and SP disciplines, instructors might integrate a guiding framework at the beginning of the course and stepwise guidelines during the process so that the needs of both groups are met. Knowing that the visual style is dominant across disciplines, instructors might include more graphic presentations (e.g., Question 11) in language classrooms rather than continue to use slides or boards filled with words. Furthermore, to communicate with students effectively and deliver effective teaching, instructors may target these students' combined learning styles. While some of these methods are already practiced, this study serves as a further reminder of the rationale underlying them and thus increases the confidence of both learners and teachers in these practices. The practical implications of this study therefore mainly concern classroom teachers and educational researchers, who may draw inspiration for interdisciplinary curriculum design and for the tailored application of learning styles to instruction.

Conclusions

This study investigated learning style differences among students with diverse disciplinary backgrounds in a blended English language course based on the Felder-Silverman model. By introducing a machine learning algorithm new to this context, namely SVM, into the data analysis, the following conclusions can be reached. First, the multiple performance indicators used in this study confirm that it is feasible to use learning styles to differentiate the various disciplines represented in students' blended learning. These disciplinary differences shape how students engage in their blended learning activities and affect their ultimate blended learning success. Second, some questions in the ILS questionnaire carry more substantive information about students' learning styles than others, and certain underlying psychological processes can be inferred from them. These psychological processes reflect students' discipline-specific epistemologies and represent a possible interaction between disciplinary background and learning style. In addition to the theoretical significance of these findings, the introduction of SVM in this study can provide inspiration for future studies of a similar type.

Despite its notable findings, this study is subject to some limitations that could be addressed in further research. First, the current analysis examined learning styles without allowing for the effects of other personal or contextual factors. The educational productivity model proposed by Walberg underlines the significance of the collective influence of contextual factors on individuals' learning [ 129 ]. For example, teachers from different backgrounds and academic disciplines are inclined to select different teaching methods and to create divergent learning environments [ 130 ], which should also be investigated thoroughly. The next step is therefore to take into account the effects of educational background, experience, personality and learning experience to gain a more comprehensive understanding of students' learning processes in the blended setting.

In conclusion, the findings of this research validate previous findings and offer new perspectives on students' learning styles in a blended learning environment, with implications for educational researchers, policy makers and educational practitioners (i.e., teachers and students). For educational researchers, this study not only highlights the merits of using machine learning algorithms to explore students' learning styles but also provides valuable information on the delicate interactions between blended learning, academic disciplines and learning styles. For policy makers, this analysis provides evidence for a more inclusive yet personalized educational policy; for instance, in addition to learning styles, the continuity of students' education across different phases should be considered. For educational practitioners, this study supports student-centered and tailor-made teaching. Its findings can help learners from different disciplines develop a more profound understanding of their blended learning tendencies and assist teachers in determining how to put students' learning styles to full pedagogical use, especially in interdisciplinary courses [ 131 – 134 ].

Supporting information

S1 File.

https://doi.org/10.1371/journal.pone.0251545.s001

S2 File. Informed consent for participants.

https://doi.org/10.1371/journal.pone.0251545.s002

S1 Dataset.

https://doi.org/10.1371/journal.pone.0251545.s003

Acknowledgments

The authors would like to thank the anonymous reviewers for their constructive comments on this paper and Miss Ying Zhou for her suggestions during the revision of this paper.

  • 15. Dunn R, Dunn K, Perrin J. Teaching young children through their individual learning styles. Boston, MA: Allyn & Bacon, Inc;1994.
  • 17. Curry L. Integrating concepts of cognitive or learning style: A review with attention to psychometric standards. Ottawa, ON: Canadian College of Health Service Executives;1987.
  • 18. Kolb DA. Experiential learning: experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall;1984.
  • 21. Anitha D, Deisy C, Lakshmi SB, Meenakshi MK. Proposing a Classification Methodology to Reduce Learning Style Combinations for Better Teaching and Learning. 2014 IEEE Sixth International Conference on Technology for Education, Amritapuri, India, 2014; 208–211. https://doi.org/10.1109/T4E.2014.5
  • 23. Kuljis J, Liu F. A comparison of learning style theories on the suitability for e-learning. Web Technologies, Applications, and Services, 2005; 191–197. Retrieved from: https://www.mendeley.com/catalogue/da014340-bfb5-32d1-b144-73545a86d440/
  • 30. Richardson JTE. Researching student learning. Buckingham: SRHE and Open University Press; 2000.
  • 33. Marton F, Säljö R. Approaches to learning. Edinburgh: Scottish Academic Press; 1984.
  • 52. Thorne K. Blended learning: How to integrate online and traditional learning, London: Kogan Page; 2003.
  • 59. Graf S. Adaptivity in learning management systems focusing on learning styles. Vienna, Austria: Vienna University of Technology;2007.
  • 79. Crockett K, Latham A, Mclean D, Bandar Z, O’Shea J. On predicting learning styles in conversational intelligent tutoring systems using fuzzy classification trees. IEEE International Conference on Fuzzy Systems. 2011; 2481–2488. https://doi.org/10.1109/FUZZY.2011.6007514
  • 82. Coffield F, Moseley D, Hall E, Ecclestone K. Learning styles and pedagogy in post-16 education: A critical and systematic review. London, UK: Learning and Skills Research Centre; 2004.
  • 86. Acuna E, Rodriguez C. The Treatment of Missing Values and its Effect on Classifier Accuracy. In Banks D, House L, McMorris FR, Arabie P, Gaul W, editors. Classification, Clustering, and Data Mining Applications. Springer Berlin Heidelberg;2004. p. 639–647.
  • 88. Boser BE, Guyon IM, Vapnik VN. A training algorithm for optimal margin classifiers. Proceedings of The Fifth Annual Workshop on Computational Learning Theory. New York: ACM Press, 1992: 144–152. https://doi.org/10.1145/130385.130401
  • 100. Maloof MA. Learning when data sets are imbalanced and when costs are unequal and unknown. Proceedings of the 20th International Conference on Machine Learning (ICML-2003). 2003. Retrieved from: http://www.site.uottawa.ca/~nat/Workshop2003/maloof-icml03-wids.pdf
  • 101. Yan L, Rodier R, Mozer M, Wolniewicz R. Optimizing classifier performance via the Wilcoxon-Mann-Whitney statistic. Proceedings of the 20th International Conference on Machine Learning (ICML-2003). 2003. Retrieved from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.324.1091&rep=rep1&type=pdf
  • 105. James G, Witten D, Hastie T, Tibshirani R. An introduction to statistical learning: with applications in R (Springer Texts in Statistics). 1st ed. New York, NY: Springer Verlag;2013.
  • 118. Anderson JR. Language, memory, and thought. Mahwah, NJ: Lawrence Erlbaum Associates;1976.
  • 119. Anderson JR. The architecture of cognition. Cambridge, Massachusetts: Harvard University Press;1983.

Constructivist Learning Theory and Creating Effective Learning Environments

  • First Online: 30 October 2021


  • Joseph Zajda

Part of the book series: Globalisation, Comparative Education and Policy Research ((GCEP,volume 25))


This chapter analyses constructivism and the use of constructivist learning theory in schools, in order to create effective learning environments for all students. It discusses various conceptual approaches to constructivist pedagogy. The key idea of constructivism is that meaningful knowledge and critical thinking are actively constructed, in a cognitive, cultural, emotional, and social sense, and that individual learning is an active process, involving engagement and participation in the classroom. This idea is most relevant to the process of creating effective learning environments in schools globally. It is argued that the effectiveness of constructivist learning and teaching is dependent on students’ characteristics, cognitive, social and emotional development, individual differences, cultural diversity, motivational atmosphere and teachers’ classroom strategies, school’s location, and the quality of teachers. The chapter offers some insights as to why and how constructivist learning theory and constructivist pedagogy could be useful in supporting other popular and effective approaches to improve learning, performance, standards and teaching. Suggestions are made on how to apply constructivist learning theory and how to develop constructivist pedagogy, with a range of effective strategies for enhancing meaningful learning and critical thinking in the classroom, and improving academic standards.

The unexamined life is not worth living (Socrates, 399 BCE).



Author information

Authors and Affiliations

Faculty of Education & Arts, School of Education, Australian Catholic University, East Melbourne, VIC, Australia

Joseph Zajda


About this chapter

Zajda, J. (2021). Constructivist Learning Theory and Creating Effective Learning Environments. In: Globalisation and Education Reforms. Globalisation, Comparative Education and Policy Research, vol 25. Springer, Cham. https://doi.org/10.1007/978-3-030-71575-5_3



Evidence-Based Higher Education – Is the Learning Styles ‘Myth’ Important?


The basic idea behind the use of ‘Learning Styles’ is that learners can be categorized into one or more ‘styles’ (e.g., Visual, Auditory, Converger) and that teaching students according to their style will result in improved learning. This idea has been repeatedly tested and there is currently no evidence to support it. Despite this, belief in the use of Learning Styles appears to be widespread amongst schoolteachers and persists in the research literature. This mismatch between evidence and practice has provoked controversy, and some have labeled Learning Styles a ‘myth.’ In this study, we used a survey of academics in UK Higher Education (n = 114) to try to go beyond the controversy by quantifying belief and, crucially, actual use of Learning Styles. We also attempted to understand how academics view the potential harms associated with the use of Learning Styles. We found that general belief in the use of Learning Styles was high (58%), but lower than in similar previous studies, continuing an overall downward trend in recent years. Critically, the percentage of respondents who reported actually using Learning Styles (33%) was much lower than the percentage who reported believing in their use. Far more reported using a number of techniques that are demonstrably evidence-based. Academics agreed with all the posited weaknesses and harms of Learning Styles theory, agreeing most strongly that the basic theory of Learning Styles is conceptually flawed. However, a substantial number of participants (32%) stated that they would continue to use Learning Styles despite being presented with the lack of an evidence base to support them, suggesting that ‘debunking’ Learning Styles may not be effective. We argue that the interests of all may be better served by promoting evidence-based approaches to Higher Education.

Introduction

The use of so-called ‘Learning Styles’ in education has caused controversy. The basis for the use of Learning Styles is that individual differences between learners can supposedly be captured by diagnostic instruments which classify learners into ‘styles’ such as ‘visual,’ ‘kinaesthetic,’ ‘assimilator,’ etc. According to many, but not all, interpretations of Learning Styles theory, teaching individuals using methods which are matched to their ‘Learning Style’ will result in improved learning ( Pashler et al., 2008 ). This interpretation is fairly straightforward to test, and, although there are over 70 different instruments for classifying Learning Styles ( Coffield et al., 2004 ), the current status of the literature is that there is no evidence to support the use of Learning Styles in this way ( Pashler et al., 2008 ; Rohrer and Pashler, 2012 ). This has led to Learning Styles being widely classified as a ‘myth’ ( Geake, 2008 ; Riener and Willingham, 2010 ; Lilienfeld et al., 2011 ; Dekker et al., 2012 ; Pasquinelli, 2012 ; Rato et al., 2013 ; Howard-Jones, 2014 ).

Despite this lack of evidence, it appears that belief in the use of Learning Styles is common amongst schoolteachers. A 2012 study demonstrated that 93% of schoolteachers in the UK agreed with the statement "Individuals learn better when they receive information in their preferred Learning Style (e.g., auditory, visual, kinaesthetic)" ( Dekker et al., 2012 ). A 2014 survey reported that 76% of UK schoolteachers ‘used Learning Styles’ and most stated that doing so benefited their pupils in some way ( Simmonds, 2014 ). A study of Higher Education faculty in the USA showed that 64% answered yes to the question "Does teaching to a student's learning style enhance learning?" ( Dandy and Bendersky, 2014 ). A recent study demonstrated that current research papers ‘about’ Learning Styles in the higher education research literature overwhelmingly endorsed their use despite the lack of evidence described above ( Newton, 2015 ). Most of this endorsement was implicit, and most of the research did not actually test Learning Styles but rather proceeded on the assumption that their use was a ‘good thing.’ For example, researchers would ask a group of students to complete a Learning Styles questionnaire and then make recommendations for curriculum reform based upon the results.

This mismatch between the empirical evidence and belief in Learning Styles, alongside the persistence of Learning Styles in the wider literature, has led to tension and controversy. There have been numerous publications in the mainstream media attempting to explain the limitations of Learning Styles (e.g., Singal, 2015 ; Goldhill, 2016 ) and rebuttals from practitioners who believe that the theory of Learning Styles continues to offer something useful and/or that criticism of it is invalid (e.g., Black, 2016 ). Some of the original proponents of the concept have self-published their own defenses of Learning Styles, e.g., ( Felder, 2010 ; Fleming, 2012 ).

The continued use of Learning Styles is, in theory, associated with a number of harms ( Pashler et al., 2008 ; Riener and Willingham, 2010 ; Dekker et al., 2012 ; Rohrer and Pashler, 2012 ; Dandy and Bendersky, 2014 ; Willingham et al., 2015 ). These include a ‘pigeonholing’ of learners according to invalid criteria; for example, a ‘visual learner’ may be dissuaded from pursuing subjects which do not appear to match their diagnosed Learning Style (e.g., learning music), and/or may become overconfident in their ability to master subjects perceived as matching their Learning Style. Other proposed harms include wasting resources on an ineffective method, undermining the credibility of education research and practice, and the creation of unrealistic expectations of teachers by students.

This study first asked whether academics in UK Higher Education also believe in Learning Styles. We then attempted to go beyond the controversy by asking whether academics actually use Learning Styles, and how seriously they rate the proposed harms associated with their use, with the aim of understanding how best to address the persistence of Learning Styles in education. In addition, we compared belief in, and use of, Learning Styles with some educational techniques whose use is supported by good research evidence, to put the use of, and belief in, Learning Styles into context.

We found that belief in the use of Learning Styles was high (58% of participants), but that actual use of Learning Styles was much lower (33%) and lower than for other techniques which are demonstrably effective. The most compelling weakness/harm associated with Learning Styles was a simple theoretical one: 90% of participants agreed that Learning Styles are conceptually flawed.

Materials and Methods

Data were collected using an online questionnaire distributed to Higher Education institutions in the UK. Ethical approval for the study was given by the local Research Ethics Committee at Swansea University with informed consent from all subjects.

Participants

The survey was distributed via email. Distribution was undertaken indirectly; emails were sent to individuals at eight different Higher Education institutions across the UK. Those persons were known to the corresponding author as colleagues in Higher Education but not through work related to Learning Styles. Those individuals were asked to send the survey on to internal email distribution lists of academics involved in Higher Education using the following invitation text (approved by the ethics committee): "You are invited to participate in a short anonymous survey about teaching methods in Higher Education. It will take approximately 10–15 min to complete. It is aimed at academics in Higher Education," followed by a link to the survey, which was entitled "Teaching Methods in Higher Education." Thus the survey was not directly distributed by the authors and did not contain the phrase ‘Learning Styles’ anywhere in the title or introductory text. These strategies of indirect distribution, voluntary completion and deliberately not using the term ‘Learning Styles’ in the title were based upon those used in similar studies ( Dekker et al., 2012 ; Dandy and Bendersky, 2014 ) and were aimed at avoiding biasing and/or polarizing the participant pool, given the aforementioned controversy associated with the literature on Learning Styles. Although this inevitably results in a convenience sample (we do not know how many people the survey was sent to or how many responded), this was preferable to distributing a survey that was expressly about Learning Styles (which may have put off those who are already familiar with the concept). The survey remained open for 2 months (which included the end-of-year holiday period) and was closed once we had over 100 participants who had fully completed the survey, to ensure a sample size equivalent to similar studies ( Dekker et al., 2012 ; Dandy and Bendersky, 2014 ).

One hundred sixty-one participants started the survey, with 114 completing the survey up to the final (optional) question about demographics. This meant that 29% of participants did not complete, which is slightly better than the average dropout rate of 30% for online surveys ( Galesic, 2006 ). Question-by-question analysis revealed that the majority of these non-completers (79%) did not progress beyond the very first ranking question (ranking the effectiveness of teaching methods) and thus did not complete the majority of the survey, including answering those questions about Learning Styles. Participants had been teaching in Higher Education for an average of 11 years ( SD = 9.8). Participants were asked to self-report their academic discipline. Simple coding of these revealed that participants came from a wide variety of disciplines, including Life and Physical Sciences (26%), Arts, humanities and languages (24%), Healthcare professions (medicine, nursing, pharmacy, etc.) (16%), Social Sciences (10%), Business and Law (5%).

Materials and Procedure

The lack of an evidence base for Learning Styles has been described numerous times in the literature, and these papers have suggested that there may be harms associated with the use of Learning Styles ( Pashler et al., 2008 ; Riener and Willingham, 2010 ; Dekker et al., 2012 ; Rohrer and Pashler, 2012 ; Dandy and Bendersky, 2014 ; Willingham et al., 2015 ). We reviewed these publications to identify commonly posited harms. We then constructed a questionnaire using LimeSurvey™. All the survey questions are available via the Supplementary Material. Key aspects of the structure and design are described below. The survey was piloted by five academics from Medical and Life Sciences, all of whom were aware of the lack of evidence regarding Learning Styles. They were asked to comment on general clarity and were specifically asked to comment on the section regarding the evidence for the use of Learning Styles and whether it would disengage participants (see below). Key concepts in the survey were addressed twice, from different approaches, so as to ensure the quality of data obtained.

Participants were first asked to confirm that they were academics in Higher Education. They were then asked about their use of five teaching methods: four that are supported by research evidence [Worked Examples, Feedback, Microteaching and Peer Teaching ( Hattie, 2009 )] plus Learning Styles. They were then asked to rank these methods by efficacy.

We then asked participants about their use of Learning Styles, both generally and the use of specific classifications (VARK, Kolb, Felder, Honey and Mumford). For each of these individual Learning Styles classifications we identified, in our question, the individual styles that result (e.g., active/reflective, etc., from Felder). Thus participants were fully oriented to what was meant by ‘Learning Styles’ before we went on to ask them about the efficacy of Learning Styles. To allow comparisons with existing literature, we used the same question as Dekker et al. (2012) “Rate your agreement with this statement ‘Individuals learn better when they receive information in their preferred Learning Style (e.g., auditory, visual, kinaesthetic).”’

We then explained to participants the lack of an evidence base for the use of Learning Styles, including the work of Coffield et al. (2004) , Pashler et al. (2008) , Rohrer and Pashler (2012) , Willingham et al. (2015) . We explained the difference between learning preferences and Learning Styles, and made it clear that there was specifically no evidence to support the ‘matching’ of teaching methods to individual Learning Styles. We explained that this fact may be surprising, and that participants would be free to enter any comments they had at the end of the survey. Those academics who piloted the initial survey were specifically asked to comment on this aspect of the survey to ensure that it was neutral and objective.

We then asked participants to rate their agreement with some of the proposed harms associated with the use of Learning Styles. Mixed into the questions about harms were some proposed reasons to use Learning Styles, regardless of the evidence. These questions were interspersed so as to avoid ‘acquiescence bias’ ( Sax et al., 2003 ). Agreement was measured on a 5-point Likert scale.

Finally, participants were asked for some basic demographic information and then offered the opportunity to provide free-text comments on the content of the survey.

Quantitative data were analyzed by non-parametric methods; specific tests are described in the results. Percentages of participants agreeing, or disagreeing, with a particular statement were calculated by collapsing the two relevant statements within the Likert scale (e.g., ‘Strongly Agree and Agree’ were collapsed into a single value). Qualitative data (free-text comments) were analyzed using a simple ground-up thematic analysis ( Braun and Clarke, 2006 ) to identify common themes. Both authors independently read and re-read the comments to identify their own common themes. The authors then met and discussed these, arriving at agreed common themes and quantifying the numbers of participants who had raised comments for each theme. Many participant comments were pertinent to more than one theme.
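
As a concrete illustration of the Likert-collapsing step described above, the short sketch below (with invented responses) tallies agreement, neutrality and disagreement for a single statement.

```python
# Sketch: collapsing a 5-point Likert item into agree / neutral / disagree
# percentages, as described in the analysis. Responses are invented.
from collections import Counter

responses = (["Strongly Agree"] * 20 + ["Agree"] * 46 + ["Neutral"] * 15 +
             ["Disagree"] * 25 + ["Strongly Disagree"] * 8)

collapse = {"Strongly Agree": "agree", "Agree": "agree",
            "Neutral": "neutral",
            "Disagree": "disagree", "Strongly Disagree": "disagree"}

counts = Counter(collapse[r] for r in responses)
n = len(responses)
for category in ("agree", "neutral", "disagree"):
    print(f"{category}: {100 * counts[category] / n:.1f}%")
```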

Belief vs. Use; Do Teachers in Higher Education Actually Use Learning Styles?

We addressed this question from two perspectives. Academics were asked to identify which teaching methods, from a list of five, they had used in the last 12 months. Results are shown in Figure 1. Thirty-three percent of participants reported having used Learning Styles in the last 12 months, but this was lower than the evidence-based techniques of formative assessment, worked examples, and peer teaching. Participants were then asked "have you ever administered a Learning Styles questionnaire to your students" and were given four specific examples along with the ‘styles’ identified by those examples. The examples chosen were those most commonly found in a recent study of the literature on Learning Styles ( Newton, 2015 ). Participants were also given the option to check ‘other’ and identify any other types of Learning Styles questionnaire that they might have used. In total, 33.1% of participants had given their students some sort of Learning Styles questionnaire, with the responses for individual classifications being 18.5% (Honey and Mumford), 14.5% (Kolb), 12.9% (VARK), and 1.6% (Felder).

Figure 1. Use of various teaching methods in the last 12 months. Academics were asked which of the methods they had used in the last 12 months. Four of the methods were accompanied by a brief description: Formative Assessment (practice tests), Peer Teaching (students teaching each other), Learning Styles (matching teaching to student Learning Styles), Microteaching (peer review by educators using recorded teaching).

We subsequently asked two, more general, questions about Learning Styles. The first of these was the same as that used by Dekker et al.: "Individuals learn better when they receive information in their preferred Learning Style (e.g., auditory, visual, kinaesthetic)," with which 58% agreed. The second was "I try to organize my teaching to accommodate different student Learning Styles (e.g., visual, kinaesthetic, assimilator/converger)," with which 64% of participants agreed. These data show a contrast: general belief in the use of Learning Styles is much higher than actual use (Figure 2).

Figure 2. Belief in use of Learning Styles. At different points throughout the survey, participants were asked to rate their agreement with statements regarding their belief in, and their actual use of, Learning Styles. These questions were asked prior to informing participants about the lack of evidence for the use of Learning Styles. When asked if they believed in the use of Learning Styles 1,2 , approximately two thirds of participants agreed, whereas when asked specifically about actual use 3,4 , agreement dropped to one-third.

1 Rate your agreement with this statement: Individuals learn better when they receive information in their preferred Learning Style (Individuals learn better LS) .

2 Rate your agreement with the statement: I try to organize my teaching to accommodate different Learning Styles (Accomodate LS) .

3 Have you ever administered a Learning Styles questionnaire to your students? If so, please state which one (Given students a LSQ) .

4 Which of these teaching methods have you used in the last 12 months? (Used LS in year) .

Possible Harms Associated with the Use of Learning Styles

There was significant agreement with all the proposed difficulties associated with the use of Learning Styles, as shown in Figure 3. However, compared with the other proposed harms, participants showed stronger agreement with the statement that the theory of Learning Styles is conceptually flawed: it does not account for the complexity of ‘understanding,’ and it is not possible to teach complex concepts such as mathematics or languages by presenting them in only one style. In addition, some information cannot be presented in a single style (e.g., teaching medical students to recognize heart sounds would be impossible using visual methods, whereas teaching them to recognize different skin rashes would be impossible using sounds). In this section of the survey we also included two questions that were not about proposed harms. Forty-six percent of participants agreed with the statement "Even though there is no ‘evidence base’ to support the use of Learning Styles, it is my experience that their use in my teaching benefits student learning," while 70% agreed that "In my experience, students believe, rightly or wrongly, that they have a particular Learning Style."

Figure 3. Participants were asked to rate their agreement with various difficulties that have been proposed to result from the use of Learning Styles. Participants agreed with all the proposed harms, but there was stronger agreement (compared to the other options) with the idea that the use of Learning Styles is conceptually flawed. ∗, significantly different from a median of ‘3’ (one-sample Wilcoxon signed-rank test). #, different from the other statements (Kruskal–Wallis test).
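
The two tests named in the caption can be reproduced in outline as follows; the Likert scores are invented and the exact options used by the authors (for example, how ties were handled) are not stated in the text.

```python
# Sketch: one-sample Wilcoxon signed-rank test against the neutral midpoint (3)
# and a Kruskal-Wallis test across statements. Scores are invented examples.
import numpy as np
from scipy.stats import wilcoxon, kruskal

rng = np.random.default_rng(2)
# Hypothetical 1-5 agreement scores for three proposed harms.
harm_a = rng.integers(2, 6, size=100)   # skewed towards agreement
harm_b = rng.integers(1, 6, size=100)
harm_c = rng.integers(3, 6, size=100)

# Is agreement with harm_a significantly different from the neutral value 3?
stat, p = wilcoxon(harm_a - 3)
print(f"One-sample Wilcoxon vs. 3: W = {stat:.1f}, p = {p:.3g}")

# Do the statements attract different levels of agreement overall?
h, p_kw = kruskal(harm_a, harm_b, harm_c)
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p_kw:.3g}")
```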

Ranking of Proposed Harms

Having asked participants to rate their agreement (or not) with the various harms associated with the use of Learning Styles, we then asked participants to "Rank the aforementioned factors in terms of how compelling they are as reasons not to use Learning Styles" (1 = most compelling, 6 = least compelling) and to "only rank those factors which you agree with." There is no universal agreement on how to analyze ranking data, so we analyzed these data in two simple, descriptive ways. The first was to determine how frequently each harm appeared as the top-ranked reason. The second was to calculate a ranking score, such that the top-ranked harm was scored 6 and the lowest-ranked scored 1, and then to sum these across participants. Both are shown in Table 1. Results from both methods were similar and agreed with the prior analysis (Figure 3), with participants most concerned about the basic conceptual flaws associated with the use of Learning Styles, alongside a potential pigeonholing of learners into a particular style.

Table 1. Ranking of proposed harms as compelling reasons not to use Learning Styles.
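
The ranking score described above (top-ranked harm scored 6, lowest-ranked scored 1, summed across participants) can be tallied as in the sketch below; the participant rankings and harm labels are invented, and harms a participant chose not to rank simply contribute nothing.

```python
# Sketch: summing ranking scores across participants. A rank of 1 (most
# compelling) is worth 6 points, a rank of 6 is worth 1 point. Rankings
# and harm labels below are invented for illustration.
from collections import defaultdict

# Each dict maps a harm label to the rank that one participant gave it.
participant_rankings = [
    {"conceptually flawed": 1, "pigeonholing": 2, "wasted resources": 3},
    {"pigeonholing": 1, "conceptually flawed": 2},
    {"conceptually flawed": 1, "unrealistic expectations": 2, "pigeonholing": 3},
]

totals = defaultdict(int)
top_counts = defaultdict(int)
for ranking in participant_rankings:
    for harm, rank in ranking.items():
        totals[harm] += 7 - rank          # rank 1 -> 6 points, rank 6 -> 1 point
        if rank == 1:
            top_counts[harm] += 1

for harm, score in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{harm}: score = {score}, ranked first by {top_counts[harm]}")
```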

Continued Use of Learning Styles?

Toward the end of the questionnaire, we asked participants two questions to determine whether completing the questionnaire had made any difference to their understanding of the evidence base for the use of Learning Styles. Participants were first asked to rate their agreement with the statement "Completing this questionnaire has helped me understand the lack of any evidence base to support the use of Learning Styles." Sixty-four percent agreed, while 9% disagreed and 27% neither agreed nor disagreed.

Participants were then asked "In light of the information presented, rate your agreement with the following statement – ‘I plan to try and account for individual student Learning Styles in my teaching.’" 31.6% agreed, 43.9% disagreed, and 23.6% neither agreed nor disagreed. The results from this question were compared with those obtained before the evidence was presented, when participants were asked to rate their level of agreement with the statement "I try to organize my teaching to accommodate different student Learning Styles (e.g., visual, kinaesthetic, assimilator/converger)." The results, shown in Figure 4, reveal a statistically significant difference between the two sets of responses, suggesting that completing the questionnaire improved participants' understanding of the lack of an evidence base for the use of Learning Styles and made them less likely to continue using them. However, almost one-third of participants still agreed with the statement; they intended to continue using Learning Styles.

Figure 4. Completion of the survey instrument was associated with a change in participants' views of Learning Styles. At the beginning of the study, participants were asked to rate their agreement with the statement "I try to organize my teaching to accommodate different student Learning Styles (e.g., visual, kinaesthetic, assimilator/converger)," and 64% agreed. At the end of the study, participants were asked "In light of the information presented, rate your agreement with the following statement – ‘I plan to try and account for individual student Learning Styles in my teaching,’" and 32% agreed. ∗, a Wilcoxon signed-rank test revealed a statistically significant difference in the pattern of responses ( P < 0.0001, W = -1977).
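
A minimal sketch of the paired comparison reported in Figure 4, using a Wilcoxon signed-rank test on invented before/after Likert responses; only the call pattern, not the data, reflects the study.

```python
# Sketch: paired Wilcoxon signed-rank test comparing agreement before and
# after the evidence was presented. The 1-5 responses below are invented.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(3)
before = rng.integers(1, 6, size=114)      # "I try to accommodate Learning Styles"
shift = rng.integers(0, 3, size=114)       # some participants shift downwards
after = np.clip(before - shift, 1, 5)      # "I plan to account for Learning Styles"

stat, p = wilcoxon(before, after)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3g}")
```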

This then raised a series of interesting questions about why participants would persist in using Learning Styles despite having been presented with the evidence showing that they are not effective (although participants were not specifically asked whether they would persist in matching instructional design to student Learning Style). The sample size here, although equivalent to previous studies, is modest, and the 32% who said they would continue are only a portion of that. Thus we were reluctant to undertake extensive post hoc analysis to identify relationships within the sample. However, in response to a reviewer's suggestion we undertook a simple descriptive analysis of the profile of the 31.6% of participants who indicated that they would continue to account for Learning Styles and compared them with the 43.9% who said that they would not. When splitting the data into these two groups, we observed that almost all (94.4%) of those who said they would still use Learning Styles at the end of the survey had originally agreed with the statement "I try to organize my teaching to accommodate different student Learning Styles (e.g., visual, kinaesthetic, assimilator/converger)," and no participants from that group had disagreed. In contrast, agreement was only 40% for the group that eventually said they would not use Learning Styles, while disagreement was 46%. A similar split was found for the statement "Even though there is no ‘evidence base’ to support the use of Learning Styles, it is my experience that their use in my teaching benefits student learning": 89% of the group that would go on to say they will still use Learning Styles agreed, compared with only 18% of the group that would go on to say they will not.

Educational Research Literature

Finally, we asked participants to rate their agreement with the statement "my educational practice is informed by the education research literature." Forty-eight percent of participants agreed. A Spearman rank correlation test revealed no correlation between responses to that question and to the ‘Dekker’ question "Individuals learn better when they receive information in their preferred Learning Style (e.g., auditory, visual, kinaesthetic)" ( r = 0.07508, P = 0.4).
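
The Spearman rank correlation reported here can be computed as in the sketch below; the two sets of Likert responses are invented placeholders for the actual survey data.

```python
# Sketch: Spearman rank correlation between two Likert items, as used to
# compare the 'research-informed practice' and 'Dekker' questions.
# Responses are invented; only the call pattern is the point here.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
informed_by_literature = rng.integers(1, 6, size=114)
dekker_statement = rng.integers(1, 6, size=114)

rho, p = spearmanr(informed_by_literature, dekker_statement)
print(f"Spearman rho = {rho:.3f}, p = {p:.2f}")
```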

Qualitative Comments

Forty-eight participants left free-text comments. The dominant theme, raised by 23 participants, was the need to use a variety of teaching methods in order to (for example) keep students engaged or to promote reflection; this was often framed as ‘despite the evidence,’ that is, despite the survey having just shown the lack of effectiveness of Learning Styles. A related theme (13 participants) was a looser interpretation of ‘Learning Styles,’ for example that the term simply referred to ‘styles of learning,’ while a second related theme, from nine participants, was that they would still, despite the evidence, use Learning Styles and/or found them useful. Eight participants commented that they were aware of the lack of evidence base for the use of Learning Styles, and eight participants gave their own examples of why Learning Styles are conceptually flawed. Despite the careful piloting described above, a small number of participants (four) commented that the survey was biased against Learning Styles, while eight participants perceived some of the questions to be ‘leading.’ No specific ‘leading’ questions were identified, but there was a substantial overlap between these two themes: three of the comments about the survey being ‘biased against Learning Styles’ came alongside, or as part of, a comment about questions being ‘leading,’ with an implied relationship between the two. A final theme, from five participants, was thanks, for raising the issue and/or for the interesting content.

Discussion

The first aim of this study was to determine how widespread belief in, and use of, Learning Styles is amongst academics in UK Higher Education. In a 2012 study, 93% of a sample of 137 UK schoolteachers agreed with the statement “Individuals learn better when they receive information in their preferred learning style (e.g., auditory, visual, kinesthetic)” (Dekker et al., 2012). In our sample of academics in UK Higher Education, 58% agreed with that same statement, while 64% agreed with the similar, subsequent statement “I try to organize my teaching to accommodate different Learning Styles.” Thus a majority of academics in UK HE ‘believe’ in the use of Learning Styles, although the figures are lower than in the 2012 study of schoolteachers. However, prior to asking these questions we asked some more direct questions about the actual use of Learning Styles instruments. Here the figures were much lower, with 33% of participants answering ‘yes’ to the question “Have you ever administered a Learning Styles questionnaire to your students,” and the same number stating that they had used ‘Learning Styles’ as a method in the last 12 months, where the method was defined as “matching teaching to individual student Learning Styles.” This value was lower than for a number of teaching methods that are evidence-based. Interestingly, the most commonly used Learning Styles instrument was the Kolb Learning Styles Inventory; this is the Learning Styles classification that has been most frequently tested for evidence of a ‘matching effect,’ and no such evidence has been found (Pashler et al., 2008).

The empirical evidence is clear that there is currently no support for the use of Learning Styles instruments in this way (Coffield et al., 2004; Pashler et al., 2008), and thus the fact that actual use of Learning Styles is lower than the use of demonstrably evidence-based methods could be considered reassuring, as could our finding that actual use is lower than ‘belief’ in the efficacy of Learning Styles. In addition, although we find that a majority of UK academics in Higher Education believe in the use of Learning Styles, the actual numbers observed are the lowest of any similar study. Studies examining belief in the use of Learning Styles have been carried out over the last few years in a number of different populations, and the overall trend is downward: from 93% of UK schoolteachers in 2012 (Dekker et al., 2012), to 76% of UK schoolteachers in 2014 (Simmonds, 2014), to 64% of HE academics in the US in 2014 (Dandy and Bendersky, 2014), to 58% here. There are obviously a number of caveats to consider before concluding that belief in the use of Learning Styles is declining; these studies were conducted in different countries (US and UK) and in different settings (schools and higher education). A follow-up, longitudinal study across different populations/contexts would be informative to address whether belief in the use of Learning Styles is truly declining, and to further understand whether actual use of Learning Styles is lower than ‘belief,’ as we have found here.

However, a more pessimistic interpretation of the data would focus on our finding that one-third of academics in UK Higher Education have, in the last year, used a method that was shown to be ineffective more than a decade earlier. The free-text comments give us some insight into the broader issue and perhaps a further hypothesis as to why the ‘myth’ of Learning Styles persists. The dominant theme was a stated need to use a diverse range of teaching methods. This is a separate issue from the use of Learning Styles; there was no suggestion in the survey that to reject Learning Styles is to advocate teaching all students in the same way, or using only one method of teaching. Neither of these approaches is advocated by the wider literature that seeks to ‘debunk’ Learning Styles, but it is clear from the abundance of comments on this theme that the two issues were related in the view of many of the participants. This is supported by the emergence of the related theme of ‘styles of learning rather than Learning Styles’; many participants had a looser definition of ‘Learning Styles’ than those introduced early in the survey. This finding leads us to urge caution and clarity in the continued ‘debunking’ of the ‘myth’ of Learning Styles. Learners obviously have preferences for how they learn, there is an obvious appeal to using a variety of teaching methods, and there is value in asking students to reflect on the ways in which they learn. However, these three concepts are unrelated to the (unsupported) idea that there is a benefit to learners from diagnosing their ‘Learning Style’ using one of the specific classifications (Coffield et al., 2004) and attempting to match teaching to those styles, yet they were clearly linked in the minds of many of our participants.

Participants agreed with many of the statements describing proposed harms or weaknesses of Learning Styles. Part of our intention here was to understand which of these are the most compelling; all have at least face validity, if not empirical evidence to support them. As we attempt to ‘spread the word’ about Learning Styles and promote alternative, evidence-based approaches, it is useful to know where the perceived weaknesses of Learning Styles lie. Thus our aim was not so much to observe absolute rates of agreement with individual harms/weaknesses (we would expect to see agreement, given that participants had just been told of the lack of evidence for Learning Styles), but to identify any differences in rates of agreement between the individual statements. There was strongest agreement with the conceptual weaknesses associated with Learning Styles theory: that it is not possible to teach ‘understanding’ using a particular style, or to capture certain types of learning in all styles. Weakest agreement was with the statement that “The continued promotion of Learning Styles as a product is exploiting students and their teachers, for the financial gain of those companies which sell access to, and training in, the various Learning Style questionnaires.” The difference between the ‘conceptual weakness’ statements and the other weaknesses/harms was statistically significant, suggesting that, where efforts are being made to ‘debunk’ the ‘myth’ of Learning Styles, an appeal to the simple conceptual problems may be the most compelling approach. This would also seem to fit with the data described above regarding ‘belief vs. use’: although it is tempting to believe that individual students have a Learning Style that can be utilized to benefit their education, the conceptual flaws inherent in the theory mean that actually putting Learning Styles into practice may prove challenging.

Completion of the questionnaire, which highlighted the problems associated with the use of Learning Styles, was clearly associated with a group-level shift in the stated likelihood that participants would use Learning Styles. We must also consider that, having been presented with the evidence that Learning Styles are not effective, some participants may have succumbed to a form of social desirability bias, wherein participants respond in the way that they perceive the researchers desire or expect (Nederhof, 1985). However, despite being presented with that evidence, approximately one-third of participants still agreed with the statement “In light of the information presented… ‘I plan to try and account for individual student Learning Styles in my teaching.’” As described in the section “Introduction,” there is an ongoing controversy, often played out via blogs and social media, about the use of Learning Styles, with some continuing to advocate for their use despite this evidence. It is even possible that persisting with a ‘myth debunking’ approach to Learning Styles may be counter-productive; the so-called ‘backfire effect’ describes a phenomenon wherein attempts to counter myths and misconceptions result in a strengthening of belief in those myths. For example, 43% of the US population believe that the flu vaccine causes flu, and amongst that group are some who are very worried about the side effects of vaccines. Correcting the misconception that the vaccine causes flu is effective in reducing belief in the myth, yet it reduces the likelihood that those who are concerned about vaccines will get vaccinated (Nyhan and Reifler, 2015). We observed that almost all those who said they would still use Learning Styles after completing the survey had originally said that they try to account for Learning Styles in their teaching. An interesting question for further study would be whether, for those who are currently using Learning Styles, being presented with the (lack of) evidence regarding their use makes it more likely that they will continue to use them. In addition, it may be informative to use an in-depth qualitative approach that would allow us to understand, in detail, what it is about Learning Styles that continues to appeal.

Instead of focusing on Learning Styles, it may be more productive for all, most importantly for students, to focus on the use of teaching and development activities that are demonstrably effective. One example is microteaching, a simple multi-peer review activity whose effectiveness has been repeatedly demonstrated in teacher-training settings (Yeany and Padilla, 1986). Only 12% of survey participants here stated that they had used microteaching within the last 12 months, yet to do so would be relatively straightforward: it is little more than the addition of a few more peers to an episode of peer observation, something that is routinely undertaken by academics in UK Higher Education. This finding may be confounded by participants simply not being aware that ‘microteaching’ means, essentially, ‘multi-peer observation and feedback,’ although this was explained twice in the survey itself.

Further support for an approach focused on raising awareness comes from our finding (Figure 1) that, as a group, participants’ stated use of different teaching methods mapped directly onto their perceived usefulness (e.g., the most commonly used technique was formative assessment, which was also perceived as the most effective). It seems reasonable to infer a causal relationship between these two observations, i.e., that participants use techniques which they consider to be effective, and thus that if we can raise awareness of techniques which are demonstrably effective, their use will increase.

There are some limitations to our study. A review of factors associated with dropout from online surveys (Galesic, 2006) observed that the average dropout rate amongst general-invitation online surveys (such as this one) is ∼30%, so our dropout rate is entirely within expectations, although upon reflection we could perhaps have designed the instrument in a way that reduced dropout. A number of factors are associated with higher dropout rates, including the participant’s level of interest in the topic and the presence of ‘matrix questions.’ As described in the methods, we deliberately avoided titling the survey as being about ‘Learning Styles,’ to avoid biasing the responses, and a detailed analysis of the participation rate for each question revealed that the majority of dropouts occurred very early in the survey, after participants were asked to rank the effectiveness of the five teaching methods; a question potentially requiring more effort than the others. An additional point reviewed by Galesic (2006) is the evidence that the quality of responses tails off in the items preceding the actual dropout point, so the fact that the participation rate remained steady after this early dropout is reassuring. It would also have been helpful to have a larger sample size. Although ours was equivalent to that of similar studies (Dekker et al., 2012; Dandy and Bendersky, 2014), a larger sample may have allowed us to tease out more detail from the responses, for example to determine whether ‘belief’ in Learning Styles was associated with any of the demographic factors (e.g., subject discipline, or age) and so gain a deeper understanding of why and where Learning Styles persist.
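A per-question participation check of the kind described here is straightforward to reproduce on a wide-format survey export; the sketch below uses hypothetical column names and responses (None meaning ‘not answered’) and simply reports the percentage of participants answering each item in survey order.

```python
import pandas as pd

# Hypothetical wide-format survey export: one column per item, None = not answered.
responses = pd.DataFrame({
    "q1_rank_methods":  [1, 2, None, 1, 3],
    "q2_used_last_12m": ["yes", "no", None, None, "yes"],
    "q3_dekker_belief": [4, 5, None, None, 3],
})

# Percentage of participants who answered each question, in survey order.
participation = responses.notna().mean().mul(100).round(1)
print(participation)
```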

In summary, we found that 58% of academics in UK Higher Education believe that Learning Styles are effective, but only about a third actually use them, a lower percentage than use other, demonstrably evidence-based techniques. Ninety percent of academics agreed that there is a basic conceptual flaw in Learning Styles theory. These data suggest that, although there is an ongoing controversy about Learning Styles, their actual use may be low, and that further attempts to educate colleagues about the shortcomings of Learning Styles might best focus on these fundamental conceptual limitations. However, approximately one-third of academics stated that they would continue to use Learning Styles despite being presented with the evidence against them. Thus it may be better still to focus on the promotion of techniques that are demonstrably effective.

Author Contributions

PN conceived the study, PN and MM designed the questionnaire, PN piloted and distributed the questionnaire, PN and MM analyzed the data, PN wrote the manuscript.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors would like to thank those colleagues who distributed the survey at their institutions, and Helen Davies from the Swansea Academy of Learning and Teaching for support with LimeSurvey™.

Supplementary Material

The Supplementary Material for this article can be found online at: http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00444/full#supplementary-material

  • Black C. (2016). Science/Fiction: How Learning Styles Became a Myth. Available at: http://carolblack.org/science-fiction/
  • Braun V., Clarke V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa
  • Coffield F., Moseley D., Hall E., Ecclestone K. (2004). Learning Styles and Pedagogy in Post-16 Learning: A Systematic and Critical Review. The Learning and Skills Research Centre.
  • Dandy K., Bendersky K. (2014). Student and faculty beliefs about learning in higher education: implications for teaching. Int. J. Teach. Learn. High. Educ. 26, 358–380.
  • Dekker S., Lee N. C., Howard-Jones P., Jolles J. (2012). Neuromyths in education: prevalence and predictors of misconceptions among teachers. Front. Psychol. 3:429. doi: 10.3389/fpsyg.2012.00429
  • Felder R. M. (2010). Are Learning Styles Invalid? (Hint: No!). Available at: http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/LS_Validity%28On-Course%29.pdf
  • Fleming N. D. (2012). The Case Against Learning Styles. Available at: http://vark-learn.com/wp-content/uploads/2014/08/The-Case-Against-Learning-Styles.pdf
  • Galesic M. (2006). Dropouts on the web: effects of interest and burden experienced during an online survey. J. Off. Stat. 22, 313–328.
  • Geake J. (2008). Neuromythologies in education. Educ. Res. 50, 123–133. doi: 10.1080/00131880802082518
  • Goldhill O. (2016). The Concept of Different “Learning Styles” Is One of the Greatest Neuroscience Myths. Quartz. Available at: http://qz.com/585143/the-concept-of-different-learning-styles-is-one-of-the-greatest-neuroscience-myths/
  • Hattie J. A. C. (2009). Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement. London: Routledge.
  • Howard-Jones P. A. (2014). Neuroscience and education: myths and messages. Nat. Rev. Neurosci. 15, 817–824. doi: 10.1038/nrn3817
  • Lilienfeld S. O., Lynn S. J., Ruscio J., Beyerstein B. L. (2011). 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior. Hoboken, NJ: John Wiley & Sons.
  • Nederhof A. J. (1985). Methods of coping with social desirability bias: a review. Eur. J. Soc. Psychol. 15, 263–280. doi: 10.1002/ejsp.2420150303
  • Newton P. M. (2015). The learning styles myth is thriving in higher education. Front. Psychol. 6:1908. doi: 10.3389/fpsyg.2015.01908
  • Nyhan B., Reifler J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine 33, 459–464. doi: 10.1016/j.vaccine.2014.11.017
  • Pashler H., McDaniel M., Rohrer D., Bjork R. (2008). Learning styles: concepts and evidence. Psychol. Sci. Public Interest 9, 105–119. doi: 10.1111/j.1539-6053.2009.01038.x
  • Pasquinelli E. (2012). Neuromyths: why do they exist and persist? Mind Brain Educ. 6, 89–96. doi: 10.1111/j.1751-228X.2012.01141.x
  • Rato J. R., Abreu A. M., Castro-Caldas A. (2013). Neuromyths in education: what is fact and what is fiction for Portuguese teachers? Educ. Res. 55, 441–453. doi: 10.1080/00131881.2013.844947
  • Riener C., Willingham D. (2010). The myth of learning styles. Change 42, 32–35. doi: 10.1080/00091383.2010.503139
  • Rohrer D., Pashler H. (2012). Learning styles: where’s the evidence? Med. Educ. 46, 634–635. doi: 10.1111/j.1365-2923.2012.04273.x
  • Sax L. J., Gilmartin S. K., Bryant A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Res. High. Educ. 44, 409–432. doi: 10.1023/A:1024232915870
  • Simmonds A. (2014). How Neuroscience Is Affecting Education: Report of Teacher and Parent Surveys. Available at: https://wellcome.ac.uk/sites/default/files/wtp055240.pdf
  • Singal J. (2015). One Reason the “Learning Styles” Myth Persists. Available at: http://nymag.com/scienceofus/2015/12/one-reason-the-learning-styles-myth-persists.html
  • Willingham D. T., Hughes E. M., Dobolyi D. G. (2015). The scientific status of learning styles theories. Teach. Psychol. 42, 266–271. doi: 10.1177/0098628315589505
  • Yeany R. H., Padilla M. J. (1986). Training science teachers to utilize better teaching strategies: a research synthesis. J. Res. Sci. Teach. 23, 85–95. doi: 10.1002/tea.3660230202
