Tracking students through college and into the workforce is an idea whose time has come back, reports Inside Higher Ed. The Student Right to Know Before You Go Act revives a controversial idea opposed by privacy advocates and adds a federal “unit record” database administered by the Education Department.
Colleges would make information public about students’ salaries by major and program; graduation and remediation rates; success rates for students who receive a Pell Grant or veterans’ benefits; and other benchmarks not currently collected in such detail.
. . . A unit record database has long been the holy grail for many policy makers, who argue that collecting data at the federal level is the only way to get an accurate view of postsecondary education. But privacy advocates, private colleges and Congressional Republicans teamed up to block the idea the last time it was proposed, by the Bush administration in 2005. The opponents succeeded; the 2008 reauthorization of the Higher Education Act included a provision specifically forbidding the creation of a federal unit record data system.
Nearly every advocacy group, think tank, committee and panel has called for a federal unit record system, reports Inside Higher Ed. States are developing databases to track their own students, but the federal government’s Integrated Postsecondary Education Data System still ignores part-time students and counts many transfers as dropouts. As more young people “swirl” from one campus to another and yet another, IPEDS data is increasingly inadequate for policymakers.
Privacy is a phony issue, writes Reihan Salam on National Review. It’s easy to make the data anonymous. Students and their parents really do have a right to know the odds of success before they write the first tuition check, writes Salam. Reliable data on student outcomes would threaten colleges and universities that offer a substandard education and leave students in debt and without marketable skills.
When students read e-textbooks, e-books will be reading students, reports The Chronicle of Higher Education. CourseSmart, which sells digital textbooks, will provide “a new tool to help professors and others measure students’ engagement with electronic course materials.”
Say a student uses an introductory psychology e-textbook. The book will be integrated into the college’s course-management system. It will track students’ behavior: how much time they spend reading, how many pages they view, and how many notes and highlights they make. That data will get crunched into an engagement score for each student.
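CourseSmart hasn't published how the tracked behaviors are combined, but the inputs described — time spent, pages viewed, notes and highlights — suggest a weighted index of some kind. A minimal sketch, with weights and caps invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class ReadingSession:
    """One student's e-textbook interaction log (hypothetical fields)."""
    minutes_read: float
    pages_viewed: int
    notes: int
    highlights: int

def engagement_score(s: ReadingSession) -> float:
    """Fold the tracked behaviors into a single 0-100 score.

    The weights and caps below are guesses for illustration;
    CourseSmart's actual formula is not public.
    """
    raw = (0.5 * min(s.minutes_read / 120, 1.0)            # time spent, capped at 2 hours
           + 0.3 * min(s.pages_viewed / 40, 1.0)           # pages viewed, capped at 40
           + 0.2 * min((s.notes + s.highlights) / 10, 1.0))  # annotations, capped at 10
    return round(100 * raw, 1)
```

A professor's dashboard would then flag students whose scores fall below some threshold — which is exactly why the weighting matters: a student who skims many pages quickly could outscore one who reads a few pages closely.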
The idea is that faculty members can reach out to students showing low engagement, says Sean Devine, chief executive of CourseSmart. And colleges can evaluate the return they are getting on investments in digital materials.
Students will be able to opt out if they don’t want Big Teacher monitoring their reading habits, Devine said at the Educause conference. “We do understand the Big Brother aspects of it.”
Human sexuality students at Western Nevada College are required to masturbate, keep sex journals and write a term paper on their sexual histories, according to a federal lawsuit filed by a former student.
Karen Royce charged that the professor, Tom Kubistant, ignored her complaints that the class assignments invaded her privacy and amounted to sexual harassment.
“I raised my hand and said, ‘I don’t masturbate,’” Royce told KRNV. “He said I had to do it at least three times in order to get a grade in the class.”
On the first day of class, Kubistant allegedly told students that he would “increase their sexual urges to such a height that they won’t be able to think about anything other than sex.”
Although the class included high-schoolers taking the course for college credit, Kubistant told students to list different types of sex and sexual positions, read the lists aloud, and turn in three 250-word journal entries on their sexual thoughts, Royce told ABC News.
Kubistant also introduced the final exam term paper, called “A Sexual Case Study … You!” in which students would be required to describe their sexual exploration, any sexual abuse they had experienced, losing their virginity, cheating, arousal, climaxes and fetishes, among other assignment topics.
Royce complained to the instructor and then to school authorities. “The investigator found no evidence to support the student’s complaint of sexual harassment,” (spokeswoman Anne) Hansen told ABC. “In fact, the investigator found that the instructor was considered to be an excellent and caring professor, who, with the exception of that one student, appeared to be universally admired by other students who had taken the course.”
Royce, who’d returned to college to pursue a career as a social worker, believed the human sexuality course would help her professionally. She dropped the class after four sessions.
The U.S. Education Department’s plan to include part-time and transfer students in community college success rates is a major step forward, writes Thomas Bailey, who chaired the Committee on the Measure of Student Success and directs the Community College Research Center at Teachers College, Columbia. However, the new measures still won’t answer important questions about student success.
The plan will clarify who counts as a “degree-seeking” student and “improve the collection and analysis of data on students who receive federal financial aid,” Bailey notes. It also calls for improved state data systems to track students over time.
However, the federal action plan calculates a “graduation rate” that includes both students who earned a degree and those who transferred without graduating. The two outcomes should not be lumped together, the CMSS recommended. “Transfer is a key outcome for community-college students, but it is not the same thing as graduating,” Bailey writes.
The Department of Education also rejected the CMSS’s suggestion that colleges disaggregate outcomes for community-college students who are deemed ill-prepared for college-level work and are therefore assigned to remedial education. While this might be difficult for colleges to do, it is important—not least because so many students fall into this category. The action plan should recognize the need to develop better information about the success of these students.
Many questions about student outcomes will not be answered by the new measures, Bailey predicts. To really understand student success, we’d need “a data system that would allow us to track individual students over time as they move around the country and among institutions.” Because of privacy concerns, this is a controversial idea. But without individual tracking, “our measures of success will remain frustratingly incomplete.”
Are you gay? California’s state colleges and universities will ask students about their sexual orientation next year on application or enrollment forms, reports the Los Angeles Times. Students wouldn’t be required to answer.
A state law encourages community colleges and state universities to determine the size of lesbian, gay, bisexual and transgender (LGBT) populations and evaluate whether they’re offering enough services, such as counseling.
In a 2010 University of California survey, 87% of students identified as heterosexual, 3% as gay, lesbian or “self-identified queer,” 3% as bisexual and 1% as “questioning” or unsure; the rest didn’t respond.
Nobody knows the college graduation rate because we’re unwilling to track individual students, writes Andrew Gillen, research director for the Center for College Affordability and Productivity. Federal data covers only first-time, full-time students and counts most transfers as dropouts, even though 38 percent of students are part-timers, one third transfer and others drop out and later re-enroll.
The fact that we spend hundreds of billions of taxpayer dollars on higher education and can’t determine something as basic as a national graduation rate is a dereliction of duty.
Student Unit Records — databases that assign each student an individual number — would make it possible to calculate an accurate, meaningful graduation rate, Gillen writes.
Matching educational records from a SUR with earnings data from the IRS would allow accurate employment outcomes to be published for each college and program. Such information would help students make better decisions, which would in turn help discipline and focus colleges.
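Mechanically, the match Gillen describes is a join on the student’s unit-record number, followed by aggregation so no individual is identifiable. A minimal sketch, with made-up records and earnings figures standing in for SUR and IRS data:

```python
from statistics import median

# Hypothetical SUR enrollment records: student number -> (college, program)
enrollments = {
    101: ("State U", "Psychology"),
    102: ("State U", "Psychology"),
    103: ("State U", "Nursing"),
}

# Hypothetical IRS earnings keyed by the same student number
earnings = {101: 38000, 102: 42000, 103: 61000}

def earnings_by_program(enrollments, earnings):
    """Join the two record sets on student number, then publish only
    an aggregate (the median) per college and program, so individual
    students' earnings are never released."""
    groups = {}
    for sid, key in enrollments.items():
        if sid in earnings:
            groups.setdefault(key, []).append(earnings[sid])
    return {key: median(vals) for key, vals in groups.items()}
```

Only the aggregated table would ever be published; the linking key and the individual rows stay inside the agency doing the match, which is one of the privacy-safeguarding approaches Gillen alludes to.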
Bad colleges oppose SURs: Accurate data would take away their excuses, writes Gillen. Good colleges are opposed too, “terrified of being compared to other schools on something like value-added earnings.”
Privacy advocates also oppose tracking students, but “convincing methods of safeguarding privacy while implementing a SUR have been developed,” Gillen writes.