What Really Drives Tuition

We often hear, from faculty, higher education pundits, and just from random people on the street, that the high cost of college is driven by spending on college administration. We've all seen the stories about the million-dollar lazy river on campus! And while there are a few of those (not paid for by tuition dollars, though), this is hardly the norm.

In the College Pricing Survey that I conducted with The Chronicle of Higher Education, we asked college presidents and chief financial officers how important various expenditures and revenue streams were in determining undergraduate tuition this past year.

About three out of four (74%) college leaders told us that the "cost of faculty" was either "extremely" or "very important" in determining tuition. This was the expenditure of highest concern for them.  

The cost of administration was of concern to far fewer college leaders, with only 44% telling us it was an "extremely" or "very important" consideration in setting tuition.

So, while administrative costs have an impact on tuition, it is quite a bit smaller than that of faculty costs. This is probably as it should be. But it is not how the issue is perceived.

College tuition is not largely driven by administrative costs. So the next time you hear someone say otherwise, remember that you have the data on your side.

So, how can higher education professionals put this to work? I turned to my friend Gavin Henning, past president of ACPA and snappy dresser, for a consultation. He suggested looking at it through the lens of staff level.

For entry-level and mid-level staff, Gavin said, this "helps to provide a counter narrative to what we hear in the news." Sharing these findings gives colleagues data to back up their own experience.

For mid-level and senior-level staff, this information can be used in budget discussions. Student affairs professionals have student learning at heart, and many students and alumni tell us that their learning was greatly enhanced by their experiences outside the classroom. Student affairs professionals can use this information to stave off budget raids when others think that "administrative bloat" needs to be curtailed.

Data always contributes to the story if you use it wisely. 

 

 

The Current Practice of College Tuition Discounting is Not Sustainable

In my last blog, I wrote about the practice of tuition discounting, and how college presidents' understanding of the way applicants and their families view tuition is very different from the reality. Tuition discounting is the widespread practice of setting the “sticker price” of tuition at a high level, but then offering financial aid to discount what people actually pay. Hardly anyone pays full price. At private colleges and universities, the average discount is about half the sticker price.

And the rate at which schools discount is increasing every year. In the last ten years, it has increased about a percentage point a year, from 38.6% in 2006-2007 to 49.1% in 2016-2017 (figures are from the very well done NACUBO Tuition Discounting Study).

Economically, this annual increase is a disaster for colleges and universities, since it means less tuition revenue per student.
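To put rough numbers on that, here is a minimal sketch in Python. Only the two discount rates come from the NACUBO figures above; the $30,000 sticker price is a hypothetical constant I'm using purely for illustration (in reality, sticker prices rose over the same period).

# Illustrative sketch: net tuition revenue per student at the NACUBO
# average discount rates cited above, holding a hypothetical sticker
# price constant to isolate the effect of the rising discount rate.
STICKER_PRICE = 30_000  # hypothetical published tuition


def net_revenue_per_student(sticker: float, discount_rate: float) -> float:
    """Average revenue an institution keeps per student after institutional aid."""
    return sticker * (1 - discount_rate)


for year, rate in [("2006-07", 0.386), ("2016-17", 0.491)]:
    net = net_revenue_per_student(STICKER_PRICE, rate)
    print(f"{year}: discount rate {rate:.1%} -> net revenue ${net:,.0f} per student")

At that hypothetical sticker price, the same entering student brings in roughly $18,400 in 2006-07 but only about $15,300 in 2016-17. That is the "less tuition revenue per student" problem in a nutshell.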


The schools know this. In a survey that I created and conducted with The Chronicle of Higher Education, we found that four out of five college presidents and chief financial officers (CFOs) thought that the practice of tuition discounting was unsustainable. At private institutions, we found that among CFOs, the people most knowledgeable about institutional finances, 70% thought that tuition discounting was not sustainable at their own institution.

Clearly, something needs to change.

For more results on the College Pricing Survey, get the report from the Chronicle’s website.

The Tuition Pricing Crisis

We have a big problem with college tuition.  

Multiple analyses of college tuition have indicated that it has risen faster than inflation and faster than the cost of health care, housing, and a host of other goods and services.

The inevitable question, then, is why is college so expensive?  And how are tuition prices determined?

Much of the research on tuition pricing looks at economic data, such as the relationship between tuition increases and decreases in state monies going to higher education. At the same time that we see public colleges receiving less money from the state, we also see that tuition at those institutions is rising. The conclusion then is that those responsible for setting tuition are making the decision to raise tuition to make up for decreasing state contributions.  

There are many other factors, however, that are more complicated.  And while the decisions around tuition pricing might be informed by economic factors, ultimately, they are made by people. 

I wanted to know what the people making the decisions were thinking.  

So, I worked with The Chronicle of Higher Education to develop a survey for college presidents and chief financial officers that would ask about their decision-making process. The results are available in a nice report on the Chronicle’s website.

I always think that when you have to solve a problem, it’s important to trace that problem back to assumptions that people have made.  They are not always right.  So, here’s a big disconnect we found.

College leaders’ assumptions about what prospective students and their families understand about college pricing are pretty much wrong.

Here’s the situation.  Many of you know this already, so bear with me for a minute.  

Hardly anyone pays those high sticker prices for college tuition.

The real problem is that many people looking at colleges don't know this.

College leaders know that prices are too high for many to afford, and so after those high prices are published, the colleges throw a lot of financial aid at the problem. Why set the price so high in the first place? Because a high sticker price is associated with quality. Just like you assume that a $50 bottle of wine is much better than a $10 bottle of wine. And it usually is, but not many people can spend $50 on a bottle of wine. So, colleges discount tuition. On average, they discount it by about half.

That means that at a college with a sticker price of $30,000, the average amount that gets charged to students is $15,000 (of course it varies student by student and school by school, but this is about average).  

All of a sudden students and their families are paying only $15,000 a year for a $30,000 a year college education.  Great deal, right?  

Here’s the disconnect.

College leaders assume that prospective students and families know this, and take it into account when applying to schools. We tested this in our survey, and over half of college leaders thought people knew they would get a discount. Three out of four college leaders at private not-for-profit institutions thought that students knew they would not pay sticker price due to tuition discounts. Many college leaders also thought that sticker price would not deter prospective students from looking at a school.

Other research indicates that this is just not the case, however. A 2013 survey by Longmire and Company, Inc. indicated that "Approximately 4 in 10 students and parents say they rejected colleges on the basis of their published sticker price alone. Six in ten say they are unaware that 'colleges discount their published price so that incoming freshmen pay less than what is published.'"
   
This is a remarkable mismatch. Forty percent of potential students reject a school out of hand purely on sticker price. Even more have no idea that tuition discounting ameliorates the sticker price. That is a very large share of the population thinking they cannot afford an institution that might actually be affordable. The sticker price shocks them. Yet the people setting that price think that what they are doing is well known.

It's not. Obviously, this is one of the big problems with college tuition. And there are more in the report I did with the Chronicle. I'll talk about a few more in subsequent blog posts, but you probably want to get that report. It's got a lot of good information for college leaders as well as prospective students and their families.

The Many Paths to an Education

Rainesford Stauffer "somewhat blindly" chose her college, she tells us in an opinion piece in The New York Times. When she arrived she seems to have done some of what we tell students to do to succeed. She joined clubs and took her studies seriously.  But she "struggled to conform to campus life."

She did not return after her first year of college. Unfortunately, this is not unusual, as about 4 out of 10 first-year students do the same. One in ten will go to another school that next fall. That leaves 3 out of 10 trying to figure out what else to do.

The young woman goes on to tell us how she went to work, volunteered, eventually obtained some college credit for what she learned in her experiences (what is called "prior learning credit"), and graduated from college this past spring.

But clearly there was a mismatch between her interests and that first college. And perhaps a key to that mismatch is choosing a college "somewhat blindly." This is too important a decision to make "somewhat blindly," but oftentimes people make such decisions this way. Perhaps that is why recent research indicates that half of college alumni wish that they had gone to a different college, chosen a different major, or pursued a different kind of degree.

Rainesford felt like a failure when the expectations she (and others) had for her life did not come true. But her story is really one of success, in which she finds joy in different careers and eventually gets that degree. The sadness is that she felt like a failure.

What I want is for a few things to happen. One is that we make it easier for potential students to pick a college that is right for them. There is just too little information out there about what matters and how to pick a place that is right for you. Another is that we recognize that going straight from high school to college is not the only way to be successful in life.

Take a year off and figure out what you want to do and why you want to do it.  A gap year can be a great experience that can focus your thoughts.  It is not just for the wealthy.  There are many ways to earn what you need during a gap year. 

Take local classes at a community college while working and taking time to figure out what you are interested in.

There are many paths to an education. That is more true every day in this world. More and more people are taking paths like Rainesford's that involve combinations of working and learning. And at 23, I bet she is not done yet.

Higher education does not have to start, or stop, right after graduating from high school. Learning is a lifelong and enriching activity. I just signed up for my first online course this past weekend and am pretty excited about it.

Education should be something that one is excited about.  If it's not, maybe that's a sign to try it another way.  There are many ways.


What is a Good Response Rate for Surveys?

"How can we get our survey response rates up?"

I'm asked this question a lot. Everyone who conducts surveys wants more people to take their surveys. But people, for the most part, don't seem to get excited about all the surveys we have for them. Given the ease with which a web survey can be created and implemented, more and more surveys are being launched. And the more survey requests someone gets, the less likely it is that they will comply. So response rates have been falling for years.

There is an assumption that the higher the response rate to a survey, the more accurate the results will be. Oftentimes the first question I get about a survey is about the response rate, because people have been conditioned to connect a low response rate with inaccurate results. That objection is most often raised when someone does not like the results of the survey. A low response rate lets people dismiss findings that they don't want to hear without actually having to deal with those findings.

But (shhhh) nobody really knows what a good response rate is. 

I recently, however, came across a paper that addressed this in a great way.

"How Important are High Response Rates for College Surveys?" is a paper by Kevin Fosnacht, Shimon Sarraf, Elijah Howe, and Leah K. Peck all at Indiana University Bloomington that was published in The Review of Higher Education in the winter of 2017 (I've linked to a version of the paper you can access without a membership in any particular library source).  It is one of the best treatments of this issue I've read.

OK.  Now, think about web surveys.  You get an email asking you to take a survey.  Maybe you do that (thank you!), or maybe the email sits in your inbox and gets lower and lower until you forget about it.  A few days later you get another email asking you to take that survey.  But you are busy brushing your cat and there is hair everywhere and then after you clean that up there is that show you've been binge watching, and again you don't take the survey.  A few days later, there it is again, another reminder to take that survey.  But this time you aren't terribly busy with something else and it sounds kind of interesting, so you click on the link and take that fascinating survey.  

You've just gone from a non-respondent to a respondent.  If I hadn't sent you that second reminder (you knew it was me sending you that survey, didn't you? It's kind of what I do), you would not have taken it.  There is a direct relationship between the effort the researchers take to get people to respond (e.g., number of emails, offering incentives, leaving the survey open for longer times, etc.) and how many do respond. As the person administering the web survey, I also know when you completed it.  So I can tell if you did it right away after that first email (low effort), or if you only did it because I emailed you three times (high effort).

We can compare the results from the early respondents with those from the later respondents. So, what were the results when only, say, 30% of the sample had responded? Let's add in the next wave of respondents and see whether the results change when we have 50% responding. This tells us whether the results change (and maybe get more accurate?) as a function of the response rate.

The authors of this particular study took a large database of responses to the National Survey of Student Engagement (NSSE) from multiple institutions to test this and, using the timing of each response, simulated surveys with a range of response rates. They could then see what happened if a survey had, say, a 10% versus a 90% response rate. Which is very cool.
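To make the logic concrete, here is a minimal sketch of that idea in Python. To be clear, this is not the authors' code: the data are randomly generated, and the sampling frame size, response times, and survey item are all made up for illustration. The point is simply that if you know when each person responded, you can order respondents by response time and recompute an estimate at successively higher simulated response rates.

# A toy version of the simulation idea (not the authors' code).
# Respondents are ordered by how quickly they answered; we then recompute
# a survey estimate using only those who would have responded at lower
# levels of recruitment effort, i.e., at lower simulated response rates.
import random
import statistics

random.seed(42)

SAMPLING_FRAME = 500  # hypothetical number of students invited

# Fake data: each respondent has a response time (hours after the first
# email) and an answer to a single survey item on a 1-5 scale.
respondents = [
    {"hours_to_respond": random.expovariate(1 / 72),
     "score": min(5.0, max(1.0, random.gauss(3.5, 0.8)))}
    for _ in range(300)  # 300 of the 500 eventually responded (60%)
]
respondents.sort(key=lambda r: r["hours_to_respond"])  # earliest responders first

for target_rate in (0.05, 0.10, 0.30, 0.60):
    k = int(SAMPLING_FRAME * target_rate)  # respondents available at that simulated rate
    subset = respondents[:k]
    mean_score = statistics.mean(r["score"] for r in subset)
    print(f"Simulated {target_rate:.0%} response rate (n={k}): mean item score = {mean_score:.2f}")

In this toy version any movement in the estimate is just sampling noise from having fewer respondents in the calculation; the value of doing it with real NSSE data, as the authors did, is that early and late responders could genuinely differ, and you can see whether that actually changes the answers.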

And what they found was that response rate didn't really matter much.

The reliability of the results was pretty good whether you had a 5% response rate or a 75% response rate. In fact, what mattered more was how many respondents you had, with smaller numbers being less reliable than larger numbers. Their study showed that reliable data could be obtained with a sampling frame of 500 even when the response rate was as low as 5% to 10%. Obviously, the number of respondents you need is also influenced by the types of analyses you want to conduct, especially if you want to look at subgroups of students (for instance, men versus women).

One study should not overly influence our practices, but the authors also provide evidence of similar findings from other researchers. I definitely recommend that you read their paper, as there are nuances in there that a blog cannot capture. But perhaps we are seeing a move away from the tyranny of the response rate.

 

More on Parental Communication in the First Year of College

Devoted readers will recall that my last blog post was prompted by a paper by Sax & Weintraub on patterns of communication between first-year college students and their parents. After I'd written it, I realized one thing was nagging me about the results, and so I contacted my friend and colleague Linda Sax, who was one of the authors. I was struck that the article did not discuss whether there were different patterns of communication with mothers and fathers for male and female students, and so I asked.

Linda told me that this was actually the second paper they had written from this set of data, and that the first paper (which she kindly sent me) had indeed published results showing that, yes, there were differences.

As one might expect, communication while students were in school was most frequent between mothers and daughters, and then between mothers and sons. For fathers, the pattern was reversed: sons were slightly more likely to communicate with fathers than daughters were. Many students were satisfied with the amount of contact they had, although 46% of women and 33% of men wanted more contact with their fathers.

Wanting more communication with fathers was associated with declines in emotional wellbeing from entering college until the spring of freshman year. While it is difficult to use these data to say definitively that lower levels of communication with fathers lead to lower levels of emotional wellbeing, this is certainly a good possibility.

In the other Sax & Weintraub paper I wrote about, we saw similar results, with lower levels of communication with fathers associated with lower levels of academic adjustment.

Again, as parents we need to remember that our job is not done once that last box is moved from the car to the dorm room.  Keeping lines of communication open is key in helping students be successful in that crucial first year of college. 

Parental Communication in the First Year of College

On a rare rainy day in Los Angeles, the latest issue of the Journal of the First-Year Experience & Students in Transition arrived via the U.S. Postal Service, and I took the opportunity to sit down and have a good read.  It turned out that my friend and colleague Linda Sax, from whom I had inherited the position of director of the Cooperative Institutional Research Program, had a paper in this issue with her collaborator, Danya Weintraub. In this paper they examined the role of parents in first-year students' college adjustment.


In this study, the authors used data from the CIRP Freshman Survey, filled out during orientation, and combined it with data from these same students on a follow-up survey the next spring.  All the students were attending a public university in the western United States. So the authors had two different points in time: at the beginning and towards the end of freshman year. 

The authors cited work I had done a few years ago, also with the CIRP Freshman Survey, with some questions I'd added on parental involvement in college-related decisions. At the time, the "helicopter parent" was a popular term to describe parents who swooped down from above and intervened in their child's life. The thinking was that this overparenting was creating young adults who were not getting the necessary experience with solving their own problems, as well as not being used to dealing with failure. We were interested in finding out how students felt about their parents being involved in decisions such as where to go to college and what courses to take. What we found was that, to a large extent, incoming freshmen thought their parents were involved the "right amount," and not too much or too little.

Getting back to Linda and Danya's article, they asked these freshmen, in the spring of their freshman year, about how often they communicated with their parents.  What they found was that students were in contact with mothers more than fathers, and that they mostly talked on the phone.  Texting was secondary to phone calls, and email was next.  This was in 2012, and my guess is that in 2016 texting might have taken over for phone calls.

While the more frequent communication with mothers was seen as just the right amount by 72% of students, the less frequent communication with fathers was just the right amount for only 55%. One third, or 33%, of students reported that the level of communication with fathers was less than they would like.

The authors then went a step further to see if parent-child interactions had a relationship with adjustment to college. Those students who sought out their fathers for social and emotional support had stronger academic adjustment to college. Those students who wanted more contact with their fathers than they had showed weaker academic adjustment. For fathers who are inclined to let mothers take the lead in communication, this is a wake-up call. We dads need to be sure to keep lines of communication open.

Mothers are important too.  Academic adjustment was better for students who felt that they had high quality communication with their mothers.

When I dropped my daughter off at college this fall, the school had a two-hour session for parents that was basically about letting go. Parental communication was encouraged, but at the pace the student initiates. Not at helicopter-parent rates. I would add that it is important to check in with your student to make sure that they feel they are getting the support they need at the rate that they feel is best.

   

What Parents Need To Know

I did something new this week. Usually I speak to college faculty, administrators, researchers, or policy wonks about my research. But having, once again, come out of the college admissions process alive and witnessed another excellent choice by one of my children, I had an idea to do something new: talk to parents of high-school students.

Choosing to go to college isn't easy.  There are a lot of choices and it is a big financial commitment.  There are also a lot of misperceptions out there.  I saw that in visiting colleges with my children and in talking with parents of their friends.  I've been in higher education for over 25 years and there were still things I didn't know.  So I decided to try and help, and spent about three months (off and on) putting together a talk that tries to convey some helpful information in what can be a time of great stress.   I want to reduce that stress for parents and students. 

I approached Notre Dame High School's Counseling Department (both my son and daughter graduated from there and had excellent college counseling) with the idea: would this be of interest? Turns out it was, and on Tuesday night I spoke in front of a standing-room-only crowd.

I was curious how well I had predicted what would be interesting to parents. Sure enough, the whole idea of tuition discounting was the one that shook the room. I used the Department of Education's College Scorecard to demonstrate it. Only a handful of parents knew of this resource. One of the parents who came up to me after the talk told me he immediately got on his phone and looked up all the schools his child was interested in.

Here are the main categories I discussed:

Why go to college?
Does it matter where you go?
What does college really cost?
Will my child have crippling debt?
How do you do college right?

My goal with this is to make the process less stressful for parents, and then hopefully also less stressful for their children.  From what I heard Tuesday night, it might have worked.    

More On College Rankings

In today's Sunday New York Times, Frank Bruni has excellent advice for how one should use (and not use) college rankings.  

There are two schools of thought on rankings. Those of us who have looked into how they are created usually side with Mr. Bruni: 1) the premise is flawed to begin with, and 2) no one ranking is going to be definitive for all prospective students.

I've written previously on how a multirank system that could be customized by the user would be a better system than the static rankings we have now. There are a wide variety of rankings that all decide for the user what is important to look for in a college. Having had two children go through the college search process, I can attest that there is no one-size-fits-all. As Mr. Bruni suggests, if one has to use rankings, then use several to illustrate different sides of a school. For instance, if community service is of interest, use the Washington Monthly rankings to get an idea of some of the schools that foster such interests.

But, as I am also on record pointing out, the rankings aren't really used that much by prospective students; they are used more often by college presidents, trustees, and alumni. The people creating the rankings know this, and some rankings are not geared towards prospective students.

So refer to rankings if you need to. But also do your own homework. Rankings are like the old Cliffs Notes that would summarize the plots of books for students who didn't have the time or inclination to read the whole book. You can get the gist of what happened, but you miss the nuances in the prose that distinguish a good book from a great book. And don't we all want that great book?

Learning About Contemplative Education

I was honored to have been asked to keynote Naropa University's "Mindfulness, MOOCs and Money in Higher Education: Contemplative Possibilities and Promise" conference this past weekend, bringing a higher education research viewpoint on how we are looking at new ways of defining and demonstrating success with our college graduates.

It was indeed a delight to meet one of my fellow keynoters, Laura Rendón, whose work I had long admired. When I was director of CIRP, we took some of her validation concepts from the qualitative realm into the quantitative and created several validation constructs in the student surveys.

I came away with a new understanding of contemplative education, which encourages experiential learning and introspection in one's learning process. I was struck by how we see this in different ways across many areas of education. In a sense, Sandy Astin's involvement theory is contemplative in nature, as it says that we get more from our education the more involved we are in it. Work that I was involved in at Gallup showed the impact of internships that allow one to connect the work environment with academic learning in the classroom, which is surely a type of experiential learning. Studies with the NSSE (National Survey of Student Engagement) also clearly show the benefits of being engaged in and out of the classroom.

And so it was the best type of experience.  I went there thinking I was going to teach them, but perhaps I ended up learning the most.