I’m sure this will vary for many people depending on their schools, where/when they were taught, and the like, so I’m interested to see what others’ experiences have been with this.
I’m also curious about what resources people have used to learn better research skills & media literacy (and found useful) if their school didn’t adequately teach either (or if they themselves whiffed on it at the time).
So I’m going to say no but in a way different from others here.
Technical details like libraries, even search engines, sources, quoting and citing … sure, these were at least touched on if not covered well enough.
But as someone who has gone on to do actual research at an academic level, I’d say the essential challenge of the task wasn’t even touched. Which is getting to the bottom of a question or field, exploring the material on said topic and then digesting and synthesising all of that. Some may hit this in undergrad depending on the degree, and it’s tricky work to do well and at an advanced level.
From what I’ve seen, the ideas and techniques required aren’t covered early on at all. It may be rather challenging to teach at an early educational level, but I’d bet it’s possible; it’s just undesirable from a school’s perspective because it’s hard to grade and takes a long time.
Thing is, I’d suspect trying to get practiced at that kind of work would actually be beneficial. You start to get insight into what it means to know things and to work things out. What it means to ask questions that aren’t common or not immediately answerable by Wikipedia (I recall realising during my master’s that Wikipedia no longer had any utility for my research, like at all) and how there are different domains and sources and levels and techniques of both knowledge and uncertainty and mystery. Whether or not a young student is good at this or gets far at it, trying it for a bit and seeing the process could be valuable for everyone.
I haven’t gone on to do actual research, but having at least completed undergrad, I’m inclined to agree. Even undergrad left me wondering a fair amount how much I’d just been a terrible student, and how much my education had somehow managed to gloss over or rush past rather critical research skills.
Sure, I knew how to search for info and kind of weigh the sources, as some others have noted, but the more involved work like you describe? Not so much. I’m fairly confident that was as much to do with the curriculum as with the limited time each class/course had to work with (plus the fact you’d also be muddling through multiple other classes/courses at once), which wouldn’t necessarily even permit assignments that had you digging in and really researching thoroughly.
Yep, agree, and had the same feeling through undergrad.
If it helps, I’ve had the same feeling through post-grad too! The whole world is on timelines and productivity goals these days … no one is allowed the time to just explore and see where things take them.
The recent Nobel Prize for medicine (for the mRNA vaccine work) is a fairly glaring indictment of how far this may have taken academia off course. For example, here’s a psychology professor trying to address the issue on mastodon.

Another example I noticed was that any older paper I’d read, though the technology and understanding (in some cases) was obviously dated and less advanced, would be of noticeably better quality than modern papers. The main difference was that older papers were more likely to report the story of an investigation. There’d be asides about things they’d checked or doubts they’d had, etc. Modern papers tend to lean more into “marketing” and feel more rushed and manufactured. Every colleague in a similar area that I’ve spoken to about this has shared similar feelings. Academics are pressured to publish at nearly breakneck speed and none of them like it. Not because it has them working hard (though it does have that effect indirectly, given just how many things academics have to do to keep the system running, including peer review), but because they aren’t allowed to work as hard as they’d like on solving problems and actually finishing projects.
Back to the topic of education … yeah, I agree that curriculum and its modularity is a big part of the problem. Bottom line is, along with the above, education is manufactured now, not cultured. Allowing a student to try, and inevitably fail and struggle, at actual research and at asking their own (or at least not spoon-fed) questions doesn’t fit neatly into the current design philosophy of education.
Thing is, I’m not sure there is much more of a point to education than allowing and helping someone learn and experience this process. It’s as simple as the “teach a man to fish” aphorism. All of the assessment- and metrics-driven design of education and curriculum, aimed at making sure someone is capable of knowing something for a short window of time, is a rather superficial view of what being educated is about. With AI, ChatGPT, etc., the specter haunting academia and the hollowness of its value proposition are looming very large IMO, but few who are around academia, or who genuinely found it valuable or value it as part of their self-worth, want to question it.
GCSE (14–16 year olds) history teaches (or is supposed to teach) the various types of source (primary, secondary, etc.), and consideration of reliability, bias, etc.
For the sciences, we were required to use reasonable sources (perhaps not direct papers and journals, but certainly reasonably reputable outlets that discuss their findings).
At college level (16-19), I honestly don’t remember this being a requirement (although I did drop history). Tests and assignments were mostly based on class teaching.
At university level, it goes full force after the first year. Everything you assert, you have to back up, using the university’s preferred referencing system.
What is college? Is it used for middle school or high school? If so, then no.
And if someone who didn’t get such an education thinks they have the tools to distinguish false from true, they are delusional. The more verified knowledge someone has, the more that person develops their ability to manage information and tell whether it’s good or bad. Tho that doesn’t mean that everyone has equal training or capacity in doing so.
If college is used as a synonym for university, then kinda yes? Tho for myself, I wasn’t really trained in finding the right sources. However, the knowledge gained from those years of education allowed me to manage information somewhat.
However, I don’t think I would have been able to really avoid bad information without the university training, where I also learned about sources and which ones are reputable.
Even now it can sometimes be hard to find a good source to check against. And often, for random info on the web that I’ll forget anyway, I don’t even bother.
It’s for post-high school education, also referred to as higher education in my area. Generally colleges are synonymous with universities in that respect where I’m from, and while I’m sure there are some slight differences between the two (probably more distinct in other areas), I don’t know what they are exactly.