Very few students are graduating high school writing at a college level, according to a report compiled by Mt. SAC in December ahead of its spring accreditation review.
In a typical English 1A course at Mt. SAC, only about five or six students will be taking their first college English course. That's because fewer than 20 percent of incoming students place directly into English 1A.
In 2015, a vast majority of students – nearly 80 percent – placed above the English as a Second Language levels but below college-level English 1A.
All incoming students who plan on graduating with a degree or taking a class with an English prerequisite are required to take the Assessment of Written English. The AWE gives students 45 minutes to compose an essay in response to a prompt.
Two readers then grade the essay and determine where the student should be placed. Students can be placed in English 1A – which is considered college level – or English 67, English 68, LERN 81 or a number of credit and noncredit ESL courses – which are considered basic skills courses and not college-level writing.
The accreditation report, which includes students coming out of high school as well as those who took time off or are returning as adults, showed that between 2011 and 2015, only about 10 percent of students were placed into the college-level English 1A course. Over 80 percent, however, were placed into either English 67, English 68, or LERN 81, while about 8 percent of students were placed into any of the ESL courses over that five-year span.
However, those numbers have improved during that time. In 2011, only 3 percent of those who took the AWE were placed into English 1A, compared to 14.3 percent in 2015.
That’s because in 2012, the English department changed the rubric that graders use to place students.
Margie Whalen, chair of the English department, said that the previous rubric had grown over time until it consisted of many different segments that readers had to look for. As a result, she said, graders could focus too narrowly on a few of the elements and mistakenly place an essay lower because it did not satisfy the requirements they were looking for, even if it satisfied other elements of the rubric.
“A team of us worked on simplifying the rubric and making sure that it reflected a kind of holistic approach instead of looking at multiple discrete elements,” Whalen said.
The new rubric asks graders to simply recognize whether the essay is well-organized, answers the prompt and effectively communicates the message the author is trying to get across.
Alex Madrigal, a 20-year-old aeronautical engineering major, said he thought the AWE gave average writers an opportunity to write a strong essay because it is not research-based and does not require introducing and analyzing quotations.
“[The AWE] lets you see what kind of person it is [writing it],” Madrigal said.
He placed into English 1A despite the fact that he does not feel he is a strong writer.
In the first year of the new rubric, placement into English 1A more than doubled. In 2015, almost five times more students were placed into English 1A than in 2011, the last year the old rubric was used.
Whalen, now 62, said other numbers showed that increase was due to a more accurate assessment of students rather than the assessment becoming easier for students to be placed higher. Because pass rates in English 1A classes remained the same, she said, students capable of succeeding in English 1A had previously been getting placed into lower levels.
Similar numbers were seen in those lower levels as well. Before the rubric change, a vast majority of students were placed into English 67. In 2015, more students were placed into English 68 than English 67, and the percentage of students placed into LERN 81 during that time span was cut nearly in half.
But even with students being assessed more accurately, fewer than one-fifth of incoming students are writing at a college level. That, Whalen said, could be for a number of reasons.
First, Whalen said that incoming students may not be used to the style of writing that the AWE asks students to do.
“For pretty solid high school students, they’ve probably been writing academic papers with no personal voice, you know, no use of the personal pronoun,” she said. “And our AWE has traditionally been, ‘Tell us about a time that you had a problem. What was the problem and how did you solve it?’ And so it’s a kind of writing that some of them may simply not have done. If they’ve been drilled on a five-paragraph essay, and if we’re asking for a piece that could be three paragraphs and be fine, then I think that’s also an issue.”
In high school, 20-year-old child development major Cassandra Rosales took AP English courses, but had to take the AWE when she started at Mt. SAC because she did not take the AP exam for college credit. When she took the AWE, she placed into English 67.
“I thought it was pretty unfair,” she said. “I thought I had done better.”
Rosales said that when she talked to other high-performing high school students, they told similar stories of being placed lower than they felt they should have been.
That high school students can sometimes struggle with the AWE could be related to the large number of subjects high school courses must cover. Because teachers must teach to state and local standards, they cannot focus solely on what local college English classes cover.
“If you think about an English course in high school, it’s not just writing. Our whole focus is reading texts and writing about them,” Whalen said. “In an English class in high school, you’re studying [literature]. You’re doing some writing. You’re doing some grammar. … If you think of the breadth of things that a high school class might cover, academic writing – closely-graded academic writing – is probably not as central to those courses as it is to ours.”
Another potential issue Whalen mentioned is that students may not understand the implications of the assessment. Today, in addition to normal academic testing such as spelling and grammar tests and chapter tests, students are taking standardized tests, benchmark tests, the PSAT and SAT or ACT, and the CAHSEE, among others. Students may see the AWE as just another 45-minute test and not pay it much attention.
Dyese Lee, a 24-year-old film major, didn’t put much preparation into the AWE because she felt as though she was a strong writer. She placed into English 1A, but test anxiety made it difficult for her. She said she thought she was going to be placed lower.
Whalen mentioned that past students have admitted to working long shifts before taking the AWE or finishing early because they were sick or angry on the day of the assessment.
She also said that a lot of students go into the test not knowing what the prompt may be or how it is graded.
“The campus assessment website has sample questions, sample tests, [and] shows you the rubric. None of it’s secret, but nobody knows,” Whalen added. “It’s all out there, but I’m not sure students – understandably, coming from high school – know to do that.”
Whalen said that the AWE workshops provided by the Writing Center are also a very valuable tool that most students don’t take advantage of. She said that the percentage of students who improve their placement after attending a workshop is high.
Annabella Lara, a 19-year-old speech pathology major, said she didn’t know that there was a workshop available for the AWE, but that she probably would not have attended it if she had known.
She said she expected the AWE would be easy, and felt as though she had done well. She placed into English 68, but felt like she was placed lower than she should have been.
Rosales also did not know about the workshop, but said the workshop probably would have helped her to know what the prompt would be like beforehand. Lee said attending the workshop probably would have prevented her test anxiety during the assessment.
There are other forms of assessment used around the state. Multiple-choice tests are growing in popularity. ACCUPLACER, a math, reading and writing assessment produced by the College Board, is one such test being adopted by colleges, and Whalen said it comes with its own set of problems.
Among them, the test does not require students to do what they will spend most of their time doing in a college English class: write.
Another measure being used more widely across the state, and one being considered to factor prominently in future Mt. SAC assessment decisions, is high school GPA. While Whalen said GPA cannot be applied to every student's case and comes with its own set of challenges, it can still be a somewhat accurate way to measure how prepared a student may be to perform well in any college course.
“The thing about high school GPA that’s interesting is that it does say something about what the student knows about how to be a student, because students struggle in earlier classes – or all classes, really – because of skills, because of life, and because of those soft student skills,” she said. “You know, [some students] just don’t want to be there. Don’t really get that you have to do the reading … A high school GPA at least is predictive in that it shows whether the student has those soft skills, so we’re in the early stages across the campus of looking pretty hard at that.”
Still, Whalen said GPA should not be the only form of assessing a student’s potential to succeed at the college level, especially because community colleges are filled with students who took a break from school for a few years, or those who didn’t care too much for high school but come into college ready to start fresh.
The California Community Colleges system as a whole is also moving toward creating an assessment test to be used at each of the member schools. This common assessment, named CCCAssess, began as a project to spare students from taking multiple placement tests if they are unsure which school they will attend, or if they move schools before taking an English course, and to make assessing students more uniform throughout the state.
“Which in theory is good, except that the placement instruments are so imperfect that having all of us use a crappy multiple choice test has deep problems,” Whalen said. “Plus, the profit motive for the testing company that gets to make the test for all the California schools leaves some of us fairly suspicious.”
The different types of assessment tests – multiple choice, essay – combined with the other forms of measuring students – high school GPA, student questionnaires – are all designed to place students where they have the greatest chance of succeeding.
Placing a student too high could overwhelm them, leading them to fail and possibly give up on English or on school altogether. Placing them too low could force them to take more classes than necessary, possibly paying more to graduate and burning out after one or two lower-level courses.
“Here’s the thing: when teachers talk, we tend to talk about the student in the room who wasn’t ready. ‘I can’t believe he or she can’t do this,’ we say when we’re tired and grumpy,” Whalen said. “But what we don’t talk about very much are the six or seven students in the room who didn’t really need that [English] 67 or that [English] 68, who have made our lives easy in that class because they can do the work and they like the work, and they’re fine. We don’t talk about them.
“So I think we could be placing more students in higher levels, and they’d be doing just fine. So, we have to figure out a way to do that accurately without endangering the students who need the support and the extra coursework to get confident and competent.”