[House Hearing, 112 Congress]
[From the U.S. Government Publishing Office]




                                HEARING

                               before the

                   SUBCOMMITTEE ON HIGHER EDUCATION
                         AND WORKFORCE TRAINING

                                of the

                         COMMITTEE ON EDUCATION
                           AND THE WORKFORCE

                     U.S. House of Representatives

                      ONE HUNDRED TWELFTH CONGRESS

                             SECOND SESSION




                           Serial No. 112-69


  Printed for the use of the Committee on Education and the Workforce


                   Available via the World Wide Web:
            Committee address: http://edworkforce.house.gov



75-856 PDF                WASHINGTON : 2012
For sale by the Superintendent of Documents, U.S. Government Printing 
Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800; DC 
area (202) 512-1800 Fax: (202) 512-2104  Mail: Stop IDCC, Washington, DC 


                    JOHN KLINE, Minnesota, Chairman

Thomas E. Petri, Wisconsin           George Miller, California,
Howard P. ``Buck'' McKeon,             Senior Democratic Member
    California                       Dale E. Kildee, Michigan
Judy Biggert, Illinois               Robert E. Andrews, New Jersey
Todd Russell Platts, Pennsylvania    Robert C. ``Bobby'' Scott, 
Joe Wilson, South Carolina               Virginia
Virginia Foxx, North Carolina        Lynn C. Woolsey, California
Bob Goodlatte, Virginia              Ruben Hinojosa, Texas
Duncan Hunter, California            Carolyn McCarthy, New York
David P. Roe, Tennessee              John F. Tierney, Massachusetts
Glenn Thompson, Pennsylvania         Dennis J. Kucinich, Ohio
Tim Walberg, Michigan                Rush D. Holt, New Jersey
Scott DesJarlais, Tennessee          Susan A. Davis, California
Richard L. Hanna, New York           Raul M. Grijalva, Arizona
Todd Rokita, Indiana                 Timothy H. Bishop, New York
Larry Bucshon, Indiana               David Loebsack, Iowa
Trey Gowdy, South Carolina           Mazie K. Hirono, Hawaii
Lou Barletta, Pennsylvania           Jason Altmire, Pennsylvania
Kristi L. Noem, South Dakota         Marcia L. Fudge, Ohio
Martha Roby, Alabama
Joseph J. Heck, Nevada
Dennis A. Ross, Florida
Mike Kelly, Pennsylvania

                      Barrett Karr, Staff Director
                 Jody Calemine, Minority Staff Director


        SUBCOMMITTEE ON HIGHER EDUCATION AND WORKFORCE TRAINING

               VIRGINIA FOXX, North Carolina, Chairwoman

John Kline, Minnesota                Ruben Hinojosa, Texas
Thomas E. Petri, Wisconsin             Ranking Minority Member
Howard P. ``Buck'' McKeon,           John F. Tierney, Massachusetts
    California                       Timothy H. Bishop, New York
Judy Biggert, Illinois               Robert E. Andrews, New Jersey
Todd Russell Platts, Pennsylvania    Susan A. Davis, California
David P. Roe, Tennessee              Raul M. Grijalva, Arizona
Glenn Thompson, Pennsylvania         David Loebsack, Iowa
Richard L. Hanna, New York           George Miller, California
Larry Bucshon, Indiana               Jason Altmire, Pennsylvania
Lou Barletta, Pennsylvania
Joseph J. Heck, Nevada

                            C O N T E N T S


Hearing held on September 20, 2012...............................     1

Statement of Members:
    Foxx, Hon. Virginia, Chairwoman, Subcommittee on Higher 
      Education and Workforce Training...........................     1
        Prepared statement of....................................     2
    Hinojosa, Hon. Ruben, ranking minority member, Subcommittee 
      on Higher Education and Workforce Training.................     3
        Prepared statement of....................................     4

Statement of Witnesses:
    Cruz, Dr. Jose, vice president for higher education policy 
      and practice, the Education Trust..........................    19
        Prepared statement of....................................    21
    Fitzsimmons, Dr. Tracy, president, Shenandoah University, 
      Winchester, VA.............................................    27
        Prepared statement of....................................    28
    Hallmark, Dr. James, vice chancellor for academic affairs, 
      Texas A&M University System................................    14
        Prepared statement of....................................    16
    Schneider, Dr. Mark, vice president, American Institutes for 
      Research...................................................     6
        Prepared statement of....................................     8

Additional Submissions:
    Dr. Fitzsimmons' response to questions submitted for the 
      record.....................................................    55
    Mrs. Foxx, questions submitted for the record to:
        Dr. Fitzsimmons..........................................    54
        Dr. Hallmark.............................................    58
        Dr. Schneider............................................    61
    Dr. Hallmark's response to questions submitted for the record    59
    Dr. Schneider's response to questions submitted for the 
      record.....................................................    62



                      Thursday, September 20, 2012

                     U.S. House of Representatives

        Subcommittee on Higher Education and Workforce Training

                Committee on Education and the Workforce

                             Washington, DC


    The subcommittee met, pursuant to call, at 10:08 a.m., in 
room 2175, Rayburn House Office Building, Hon. Virginia Foxx 
[chairwoman of the subcommittee] presiding.
    Present: Representatives Foxx, Kline, Petri, Platts, 
Hinojosa, Andrews, Davis, and Altmire.
    Staff present: Katherine Bathgate, Deputy Press Secretary; 
Adam Bennot, Press Assistant; James Bergeron, Director of 
Education and Human Services Policy; Casey Buboltz, Coalitions 
and Member Services Coordinator; Heather Couri, Deputy Director 
of Education and Human Services Policy; Cristin Datch, 
Professional Staff Member; Amy Raaf Jones, Education Policy 
Counsel and Senior Advisor; Barrett Karr, Staff Director; 
Krisann Pearce, General Counsel; Dan Shorts, Legislative 
Assistant; Alex Sollberger, Communications Director; Alissa 
Strawcutter, Deputy Clerk; Kate Ahlgren, Minority Investigative 
Counsel; Tylease Alli, Minority Clerk; Meg Benner, Minority 
Education Policy Advisor; Kelly Broughan, Minority Staff 
Assistant; Brian Levin, Minority New Media Press Assistant; 
Megan O'Reilly, Minority General Counsel; Julie Peller, 
Minority Deputy Staff Director; and Michael Zola, Minority 
Senior Counsel.
    Chairwoman Foxx. Good morning. A quorum being present, the 
subcommittee will come to order. Welcome to the--to today's 
subcommittee hearing. Thanks to our witnesses for joining us to 
discuss the strengths and weaknesses of the Federal Higher 
Education Data Collection System. That is a mouthful, is it 
not?
    The 2008 reauthorization of the Higher Education Act 
included several provisions aimed at improving transparency in 
higher education. For the first time institutions were required 
to make information about higher education pricing and 
financial aid more readily available to students and families. 
Additionally, the reauthorization encouraged colleges and 
universities to provide the federal government with more 
information about basic institutional characteristics such as 
demographics and graduation rates to help students make well-
informed higher education choices.
    At the time, then Ranking Member Buck McKeon said the 
legislation would help our nation's higher education system 
``begin a transformation that will make it more accessible, 
affordable and accountable to consumers.''
    Without a doubt, the most recent reauthorization of the 
Higher Education Act started a process of enhancing higher 
education transparency. But as tuition and student debt 
continue to rise, and at an astonishing pace, it is clear more 
work must be done to help students and families make informed 
choices about their higher education options without 
overburdening institutions with counterproductive red tape.
    The Obama administration has recently suggested a need to 
make more data available to help prospective students and 
families better understand their post-secondary education 
options, as well as the financial commitment required by the 
schools they are considering. However, there is concern that 
new or additional data requirements could be duplicative or 
unnecessarily burdensome to higher education institutions. 
After all, the nation's 7,000 post-secondary education 
institutions already dedicate thousands of hours and millions 
of dollars on data reporting each year.
    In the 2011-2012 academic year, institutions spent roughly 
800,000 hours and more than $28 million filling out surveys for just 
one of the Department of Education's five main higher education 
databases. Experts predict the burden will grow to 850,000 
hours and $31 million in the 2012-2013 school year. Again, 
these numbers reflect just a portion of the federal reporting 
requirements currently imposed on our higher education 
institutions. One can only assume the total investment in 
federal data collection is much greater.
    Adding insult to injury, institutions may also be asked to 
submit additional data to accreditors and state leaders. This 
information often differs from the federal requirements, adding 
to the burden facing the nation's post-secondary schools.
    As I previously stated, those in Washington have a 
responsibility to weigh carefully any federal action to ensure 
that such actions will not create greater costs for students and 
schools, particularly in these tough economic times. In the 
next Congress this committee will be responsible for leading 
the charge once again to reauthorize the Higher Education Act. 
Today's hearing will allow us an opportunity to review the 
types of higher education data currently collected by the 
federal government, and discuss whether this information is 
useful to families, institutions and taxpayers.
    We are fortunate today to have several expert witnesses 
with us who can offer their perspectives on data reporting. And 
I expect their thoughts will inform future discussions on the 
reauthorization of the Higher Education Act.
    With that, I would like to now recognize the ranking 
member, Mr. Hinojosa, for his opening remarks.
    [The statement of Ms. Foxx follows:]

         Prepared Statement of Hon. Virginia Foxx, Chairwoman,
        Subcommittee on Higher Education and Workforce Training

    The 2008 reauthorization of the Higher Education Act included 
several provisions aimed at improving transparency in higher education. 
For the first time, institutions were required to make information 
about higher education pricing and financial aid more readily available 
to students and families. Additionally, the reauthorization encouraged 
colleges and universities to provide the federal government with more 
information about basic institutional characteristics, such as 
demographics and graduation rates, to help students make well-informed 
higher education choices.
    At the time, then-Ranking Member Buck McKeon said the legislation 
would help our nation's higher education system ``begin a 
transformation that will make it more accessible, affordable, and 
accountable to consumers.'' Without a doubt, the most recent 
reauthorization of the Higher Education Act started a process of 
enhancing higher education transparency. But as tuition continues to 
rise at an astonishing pace, it is clear more work must be done to help 
students and families make informed choices about their higher 
education options without overburdening institutions with 
counterproductive red tape.
    The Obama administration has recently suggested a need to make more 
data available to help prospective students and families better 
understand their postsecondary education options as well as the 
financial commitment required by the schools they're considering. 
However, there is concern that new or additional data requirements 
could be duplicative or unnecessarily burdensome to higher education 
institutions.
    After all, the nation's 7,000 postsecondary education institutions 
already dedicate thousands of hours and millions of dollars on data 
reporting each year. In the 2011-2012 academic year, institutions spent 
roughly 800,000 hours and more than $28 million filling out surveys for 
just one of the Department of Education's five main higher education 
databases. Experts predict the burden will grow to 850,000 hours and 
$31 million in the 2012-2013 school year.
    Again, these numbers reflect just a portion of the federal 
reporting requirements currently imposed on our higher education 
institutions. One can only assume the total investment in federal data 
collection is much greater. Adding insult to injury, institutions may 
also be asked to submit additional data to accreditors and state 
leaders. This information often differs from the federal requirements, 
adding to the burden facing the nation's postsecondary schools.
    As I have previously stated, those in Washington have a 
responsibility to weigh carefully any federal action to ensure that 
such actions will not create greater costs for students and schools, 
particularly in these tough economic times. In the next Congress, this 
committee will be responsible for leading the charge once again to 
reauthorize the Higher Education Act. Today's hearing will allow us an 
opportunity to review the types of higher education data currently 
collected by the federal government and discuss whether this 
information is useful to families, institutions, and taxpayers.
    We are fortunate today to have several expert witnesses with us who 
can offer their perspectives on data reporting, and I expect their 
thoughts will inform future discussions on the reauthorization of the 
Higher Education Act.
    Mr. Hinojosa. Thank you, Chairwoman Foxx. I appreciate 
everything that you have done to make this hearing possible. 
And I think it will be very, very productive; very informative.
    I would like to thank our distinguished panel of witnesses 
for joining us today to examine the usefulness and quality of 
the data the federal government currently collects from 
institutions of higher education. It is my hope that our expert 
panel will provide us with a better understanding of the data 
reported, how students and families and policymakers use the 
data, and key areas that could be improved when this committee 
reauthorizes the Higher Education Act.
    Data on post-secondary education are critical for a number 
of reasons. As the ranking member of this subcommittee, I 
believe that these data can help students and families to make 
informed decisions on which institution best meets their unique 
needs. Data on post-secondary education can provide colleges 
and universities the information they need to improve teaching 
and learning, the quality of education programs and student 
success, particularly for minority and low-income first 
generation college students and non-traditional students.
    In my view, most policymakers find data on post-secondary 
education extremely valuable. Both federal and non-federal 
reporting and disclosure requirements, for example, are 
intended to hold colleges and universities accountable for 
rising tuition and the quality of educational programs. On 
behalf of students and families and taxpayers, we in Congress 
must ensure that the large financial investment that the 
federal government has made in higher education is making a 
difference for students and families.
    While we must be aware of the administrative burden that 
data collection and disclosure and reporting requirements 
impose on institutions of higher education, we must collect 
data that allow the federal government to monitor the use of 
Title IV financial aid dollars and empower students and 
families to make informed choices.
    In terms of the types of data that the federal government 
collects, I want to highlight one of the shortcomings of our 
current system. And I have said this for the 16 years that I 
have been in Congress. As you know, the federal government 
requires institutions to report college completion for first-
time, full-time students--only 14.6 percent of students 
enrolled in post-secondary coursework.
    As a result, current data do not reflect the increase of 
non-traditional students enrolling in our institutions. This is 
clearly a poor and inaccurate measure of how colleges and 
universities are serving all those students I enumerated.
    By collecting data on all students enrolled, including 
part-time students, colleges and universities and the federal 
government would have a more accurate picture of a student's 
academic progress and success. Importantly, as the composition 
and needs of post-secondary education students change and 
become increasingly diverse, institutions must not abandon 
their commitment to educate greater numbers of low-income and 
minority students.
    In fact, I believe that we must do more to incentivize 
institutions that expand educational opportunity to some of our 
most disadvantaged student populations. The federal government 
has a responsibility to ask clear questions, to collect 
relevant data and to provide helpful information to students 
and families making the important decision of what college to 
attend.
    And in closing I want to say that we have a lot to learn. 
And current systems could be greatly improved to reflect the 
diversity of today's higher education population. I hope 
today's panel will inform this committee so we can better 
understand how to improve the data we collect and what 
information is most useful for all of our students and 
families.
    And with that, Madam Chair, I thank you for calling this 
congressional hearing. I yield back.
    [The statement of Mr. Hinojosa follows:]

Prepared Statement of Hon. Ruben Hinojosa, Ranking Member, Subcommittee 
               on Higher Education and Workforce Training

    Thank you, Chairwoman Foxx.
    I would like to thank our distinguished panel of witnesses for 
joining us today to examine the usefulness and quality of the data the 
federal government currently collects from institutions of higher 
education.
    It is my hope that our expert panel will provide us with a better 
understanding of the data reported; how students, families and 
policymakers use the data; and key areas that could be improved when 
this committee reauthorizes the Higher Education Act.
    Data on postsecondary education are critical for a number of 
reasons. As the Ranking member of this subcommittee, I believe that 
these data can help students and families to make informed decisions on 
which institution best meets their unique needs.
    Data on postsecondary education can provide colleges and 
universities with the information they need to improve teaching and 
learning, the quality of educational programs, and student success, 
particularly for minority, low-income, first-generation college 
students, and non-traditional students.
    In my view, most policymakers find data on postsecondary education 
extremely valuable. Both federal and non-federal reporting and 
disclosure requirements, for example, are intended to hold colleges and 
universities accountable for rising tuition and the quality of 
educational programs.
    On behalf of students, families and taxpayers, we in Congress must 
ensure that the large financial investments that the federal government 
has made in higher education are making a difference for students and 
families.
    While we must be aware of the administrative burden that data 
collection, and disclosure and reporting requirements impose on 
institutions of higher education, we must collect data that allows the 
federal government to monitor the use of Title IV financial aid dollars 
and empowers students and families to make informed choices.
    In terms of the types of data that the federal government collects, 
I want to highlight one of the shortcomings of our current system.
    As you know, the federal government requires institutions to report 
college completion for first-time, full-time students, only 14.6 
percent of students enrolled in postsecondary coursework. As a result, 
current data do not reflect the increase of non-traditional students 
enrolling in our institutions.
    This is clearly a poor and inaccurate measure of how colleges and 
universities are serving all students. By collecting data on all 
students enrolled, including part-time students, colleges and 
universities and the federal government would have a more accurate 
picture of a student's academic progress and success.
    Importantly, as the composition and needs of postsecondary students 
change and become increasingly diverse, institutions must not abandon 
their commitment to educate greater numbers of low-income and minority 
students. In fact, I believe that we must do more to incentivize 
institutions that expand educational opportunity to some of our most 
disadvantaged student populations.
    The Federal Government has a responsibility to ask clear questions, 
collect relevant data, and provide helpful information to students and 
families making the important decision of what college to attend. We 
have a lot to learn, and current systems could be greatly improved to 
reflect the diversity of today's higher education population.
    I hope today's panel will inform this Committee so we can better 
understand how to improve the data we collect, and what information is 
most useful for all of our students and families.
    With that, I yield back to Chairwoman Foxx and our distinguished 
panel of experts.
    Chairwoman Foxx. Thank you very much, Mr. Hinojosa. I will 
make one comment on what you have said. This idea of 
traditional and non-traditional students, I think we are going 
to hear a lot more about today. And I have been mulling over 
that term traditional student. I think we have to get rid of 
that because it is obviously inappropriate. I--yesterday I was 
thinking about that, and I thought maybe we need to use the 
term old-fashioned student. I have tried to think of a word for 
it. But anyway, we will talk some more. But we certainly agree 
that that is an area where there is a tremendous problem.
    Pursuant to committee Rule 7(c), all subcommittee members 
will be permitted to submit written statements to be included 
in the permanent hearing record. And without objection, the 
hearing record will remain open for 14 days to allow 
statements, questions for the record and other extraneous 
material referenced during the hearing to be submitted in the 
official hearing record.
    It is now my pleasure to introduce our distinguished panel 
of witnesses. Dr. Mark Schneider is vice president for 
Education, Human Development and Workforce at the American 
Institutes for Research. He served as the U.S. Commissioner of 
Education Statistics from 2005 to 2008. He is also a visiting 
scholar at the American Enterprise Institute, and distinguished 
professor emeritus of political science at the State University 
of New York Stony Brook.
    Dr. James Hallmark is the vice chancellor for Academic 
Affairs of the Texas A&M University System. He began his career 
as an instructor of speech communication before proceeding 
through a series of positions at West Texas A&M, including dean 
of the Graduate School and Research, federal relations 
coordinator and provost/vice president for Academic Affairs. He 
became vice chancellor in 2012.
    Dr. Jose Cruz is the vice president for Higher Education 
Policy and Practice at The Education Trust. He oversees the 
National Access to Success Initiative. Dr. Cruz is former vice 
president for Student Affairs at the University of Puerto Rico.
    Dr. Tracy Fitzsimmons has been president of Shenandoah 
University since 2008. She holds a faculty appointment as 
professor of political science. Additionally she serves on the 
boards of the National Association of Independent Colleges and 
Universities and the Powhatan School.
    Before I recognize you to provide your testimony, let me 
briefly explain our lighting system. You will have 5 minutes to 
present your testimony. When you begin, the light in front of 
you will turn green. When 1 minute is left, the light will turn 
yellow. And when your time is expired, the light will turn red, 
at which point I ask that you wrap up your remarks as best as 
you are able. After you have testified, members will each have 
5 minutes to ask questions of the panel.
    I would now like--I now recognize Dr. Schneider for 5 
minutes.


    Mr. Schneider. Thank you so much. It is my pleasure to be 
here.
    So, I just want to echo some of the comments that have 
already been made about the fact that the nation spends billions 
upon billions of dollars in our higher education system. And 
actually that is taxpayer money. When we think about the 
investment that students and their families make in this we are 
talking about hundreds of billions of dollars. It is poured 
into a system that we sometimes like to think about as the best 
in the world.
    We actually have many, many world-class universities. But 
we also have hundreds upon hundreds of colleges that are not 
doing the job in terms of educating students, graduating 
students and helping them find employment after they graduate, 
which, by the way, over 90 percent of American students now 
say is job number one for the colleges that they attend.
    So, we may not have the best university system in the 
world. But we certainly have the most expensive. According to 
OECD figures, we spend more than any other OECD country and in 
fact twice as much as the OECD average. Despite all that money 
that we spend, we do not know which institutions are spending 
their money efficiently, and we do not know which universities 
are actually doing a good job in turning--in terms of 
returning--a higher return on the investment made by taxpayers 
or by students.
    So, one of the problems that we have is that our data 
system, again as referred to, is actually pretty bad. And I am 
referring specifically to IPEDS, which is the nation's number 
one system of data collection for post-secondary systems of 
education. And actually IPEDS would have been a wonderful 
system in the 1950s, but it is not appropriate for today.
    It does not work for the students that we have. It does not 
work for the enrollment patterns that we have. It does not work 
for the institutions that we have. And again, the fixation on 
first-time, full-time students is just--it is just crazy. It 
makes no sense at all, given the world that we live in.
    So, I just want to quickly note some of the areas in which 
we are making some progress and the areas in which I think need 
to be addressed by this committee and the Congress going 
forward, and hopefully be addressed in the reauthorization of 
the Higher Education Act.
    First of all, student success while in college. IPEDS 
covers far too few students and measures far too few aspects of 
student success while in college; for example, it has no 
measures of student progression. We are making progress fixing that.
    In particular I am talking about the NGA, the National 
Governors Association and Complete College America's metrics 
which I believe (a) are much more encompassing and accurate 
than IPEDS; and (b) show that in fact these data can be 
collected without a heavy burden on institutions. And again, I 
am very mindful that we want to be really careful about 
imposing additional burdens on campus. But I believe that CCA 
and NGA are showing the way forward on that.
    A second area that I think we really need to be much more 
careful about and thoughtful about is the labor market returns 
to graduation. Students in the nation need to know what fields, 
what schools, what programs are graduating students that are 
having success in the local labor market.
    This is critical information for the wellbeing of all of 
us. We need to be able to link the wage data with the Student 
Unit Record data. States can now do this. Over 30 states have 
now linked their Student Unit Records and unemployment 
insurance data, wage data. And most of this--most of those 
linkages were paid for by federal tax money.
    However, even though over 30 states can do this, the number 
of states that have made this data public is close to zero. So, 
one of the questions that I pose to all of us is what is the 
return on hundreds of millions of dollars of federal investment 
in these linked data, and why are not these data more in the 
public domain? This is a part of my life right now. But I think 
it is an important issue.
    And I think the third area that we really, really need a 
lot of work on is the cost to degree. How much does all this 
cost? We are very expensive. But our accounting systems are 
really rudimentary. So, for example, taxpayer subsidies come in 
so many different forms, and we are not tracking them well.
    So, as a result, we can say something--a degree is cheap 
because the tuition is low. But when you take in all the 
taxpayer subsidies and you look, and you standardize by measure 
of success, the fact is that something that looks cheap could 
be really, really expensive. We need much, much better finance 
data.
    I believe that we could address some of these issues in the 
reauthorization of HEA. I think, for example, that IPEDS can 
and must be approved--improved. I think that this is on 
Congress actually to identify the things that it really wants 
because most of the things in IPEDS are the result of 
congressional mandates.
    So, we need to figure out what still matters and what is 
still good. And I think we need to pay a lot of attention to 
ultimately the labor market success of students because I think 
that matters to all of us. Thank you.
    [The statement of Mr. Schneider follows:]

       Prepared Statement of Dr. Mark Schneider, Vice President,
                    American Institutes for Research

    The nation invests untold billions of taxpayer dollars in its 
higher education system. Students and their families pour even more 
into a system that often is thought of as ``the best in the world.'' 
While clearly the nation has the lion's share of the world's great 
universities, we also support hundreds upon hundreds of campuses that 
are not doing a good enough job of educating their students, graduating 
them, or helping them find jobs--which, according to a recent study by 
the Higher Education Research Institute of California, is the number 1 
goal of today's college students.
    Further, we have only rudimentary knowledge about how well all 
those billions are being spent. We do know that the United States 
spends more on higher education than any other nation in the 
Organisation for Economic Co-operation and Development (OECD),\1\ but we 
have only limited insights into which institutions are spending their 
money more efficiently than others and which are generating a higher 
return on investment for students and taxpayers.
    \1\ According to the OECD's 2012 Education at a Glance, the United 
States spends around $29,000 per higher education student compared to 
the OECD average of $13,728.
    Our inability to document student and institutional success all too 
often traces back to limits in the nation's primary system of higher 
education data collection, the Integrated Postsecondary Education Data 
System (IPEDS).
    IPEDS would be a pretty good data system for the 1950s, but IPEDS 
is flawed--perhaps fatally so--given our current system of higher 
education:
    * When it comes to students, its coverage is too limited to 
represent the changing population of students enrolled in America's 
colleges and universities.
    * When it comes to capturing different aspects of student 
success in college, IPEDS measures far too few.
     * When it comes to the crucial issue of how much higher 
education costs, IPEDS comes up short. Yes, we can use IPEDS data to 
tease out some rudimentary information about costs (thanks largely to 
the Delta Project started by Jane Wellman and now at the American 
Institutes for Research, where I work). But these insights don't begin 
to meet our information needs.
     * And when it comes to measuring taxpayers' return on the 
investment (ROI), we have to make some heroic assumptions to even 
approximate what taxpayers get in return for the vast sums they invest 
in colleges and universities.
    The nation can do better.
    With that in mind, I will sketch some of the metrics needed to 
better measure the performance of our colleges and universities. I'll 
use four categories to keep it simple:
     * Student success while in college
     * Student learning outcomes
     * Student success in the labor market
     * Costs of degrees
    I will zero in on what I see as some of the most promising 
developments in each category and discuss some of their benefits and 
costs. Then I'll take on the issue of risk adjustment to allow 
comparisons across institutions that serve different student 
populations. I'll end by comparing the U.S. Department of Education's 
present regulatory approach to measuring student success 
in the labor market with a consumer information approach that I believe 
works better with the data we have. A consumer-oriented approach could 
make it easier to find and use not only data on employment outcomes, 
but other types of information on college performance as well.
Student Success While in College
    Improving student success in college requires addressing three 
related processes: retention, progression, and completion. To earn a 
degree or a certificate, students have to stay enrolled (retention), 
they have to accumulate enough credits in a timely way (progression), 
and ultimately they have to finish school (graduation). We need far 
better measures of all three processes and we need to track far more 
students than we do now.
    As is well known, IPEDS concentrates on full-time, first-time 
beginning students. Unfortunately, this group represents fewer than 
half of all students in the country. And even for these students, 
IPEDS' measures of student success are limited.
    While IPEDS does report first-year retention rates for both full-
time and part-time students, it doesn't tell us the rates at which 
students stay in school after their first year, it has no information 
on student progression, it doesn't count most transfer students, it 
doesn't calculate student success metrics for many groups of students 
that are central to the nation's policy concerns (such as recipients of 
Pell grants), and it has no information at all about student success 
after graduation.
    Slowly (and, we must hope, surely), we are making progress on 
fixing these problems. Most notably, the National Governors Association 
is leading states to endorse Complete College America's (CCA) student 
success metrics, which will allow us to more accurately measure the 
success of far more students enrolled in colleges and universities than 
is possible with IPEDS. That's because these metrics are based on 
student-level data (held by the states, not the federal government), 
data that are much finer grained and more accurate and that cover more 
students than IPEDS.
    One area of student success that CCA emphasizes is credit 
accumulation--an intermediate step between retention and completion. 
The aim of this measurement is to determine the proportion of 
undergraduates making steady academic progress during an academic year. 
Students can return semester after semester, but if they aren't 
completing courses and earning credits at a pace that will allow them 
to get a bachelor's degree within 6-8 years or an associate's degree in 
around 4 years, many will likely never graduate. Capturing the 
percentage of students who are progressing fast enough toward their 
degree is one measure to which IPEDS needs to pay far more attention.
    These kinds of student success measures are built on student-level 
data that most campuses and states should have and that can be compiled 
both relatively quickly and cheaply. Moreover, they can be produced 
now, without a long lead time. In turn, I believe that we can vastly 
improve our measurement of student success without imposing undue 
burden on states or campuses--something about which we all need to be 
mindful.
    Despite its importance, the CCA effort isn't broad enough. Yes, 
over half the states in the nation now provide Complete College America 
with expanded metrics, but these cover only public institutions and 
currently the data are not reported at the campus or program level.
Student Learning
    Higher education is about just that: educating students. However, 
the task of actually measuring how much college students have learned 
is just beginning to gain traction.
    Critics have long suspected that far too many colleges are not 
improving student skills. Richard Arum and Josipa Roksa's book, 
Academically Adrift, elevated that concern from faculty office 
anecdotes to a headline issue.
    Arum and Roksa show that during their first two years of college, 
almost half of the students in their study did not improve in critical 
thinking, complex reasoning, or writing. Moreover, they show that 
students are distracted by socializing or working and that many 
colleges and universities put undergraduate learning close to the 
bottom of their priorities.
    One of the strengths of Academically Adrift is its empirical base. 
Rather than asserting that students are not learning, Arum and Roksa 
used the Collegiate Learning Assessment (CLA) to measure students' 
cognitive skills. Among the growing number of college student 
assessments, the CLA has so far attracted the most attention; however, 
other assessments are available (such as ACT's Collegiate 
Assessment of Academic Proficiency or ETS' Proficiency Profile test) 
and more will likely be coming to market as policymakers demand 
measures of the value added of college education.
    My preference is for actual assessments of learning outcomes, such 
as CLA, not the less telling process-oriented studies such as the 
National Survey of Student Engagement (NSSE) and the Community College 
Survey of Student Engagement (CCSSE). While some NSSE and CCSSE questions 
are more valid on their face than others--for example, those on how 
often students wrote research papers or talked with faculty--overall 
NSSE and CCSSE measure process, not outcomes, so their correlation with, 
say, graduation rates, is low.\2\
    \2\ For example see FALSE FRONTS? Behind Higher Education's 
Voluntary Accountability Systems by Andrew P. Kelly and Chad Aldeman. 
Available at: http://www.educationsector.org/usr_doc/False_Fronts.pdf 
and Assessing NSSE by Mark Schneider. Available at http://
    There are questions about the cost of CLA (and other such 
assessments) and questions about how students approach low-stakes 
tests. But even more important are questions about the role the federal 
government should play in college assessments. Within those 
constraints, Congress should continue to monitor the progress of 
efforts to evaluate how much students learn and how much college helps 
them build their skills.
Student Success in the Labor Market
    While improving measures of student learning and student progress 
are important, ultimately we need to assess the extent to which labor 
markets are validating the level and usefulness of the skills college 
graduates possess.
    About half the nation's states can now link student-level data that 
document each collegian's experiences (including major field of study) 
to unemployment insurance records that can track post-graduation 
earnings. These data let us compare the returns on the investment 
students and taxpayers have made in, say, a student with a bachelor's 
degree in sociology to the investments in a similar student who earned 
a bachelor's degree in English literature from the same campus.
    Perhaps even more important, these linked data let us measure the 
returns to students with the same credential coming from different 
campuses. Students and policymakers can therefore compare how 
successful students with, say, a bachelor's degree in materials 
sciences from one school match up to students with the same degree from 
another campus. While higher education is about many other things 
besides labor market success, for most students, their families, and 
state policy makers, higher education is the ultimate economic 
development strategy. So all need to know how students fare after they 
graduate.
    On September 18, 2012, I released data documenting the first-year 
earnings of graduates from programs across public institutions in 
Tennessee. These data document how much variation there is in the 
earning power of graduates from diverse fields of study--but the data 
also show how much variation there can be in the earnings of graduates 
from the same field of study across different institutions.
    As this graph from the report shows, there is nearly a $15,000 
difference in first-year earnings of bachelor's degree holders in the 
same area of study, the health professions, from the University of 
Memphis versus graduates from the University of Tennessee. A smaller 
gap, but still around $7,000 in first-year earnings, separates 
graduates from the University of Tennessee in Multi/Interdisciplinary 
Studies from graduates from East Tennessee State. Note also that while 
Tennessee State graduates in Health professions lagged every other 
campus, their graduates in Multi/Interdisciplinary Studies were the 
highest paid in the state, on average, for students with this major. 
This reinforces the need for information about specific programs--
because, to repeat, success often is not uniform across programs or 
across institutions.

    Tennessee data, not presented here, also show how well many 
students with technical two-year degrees from community colleges do in 
the job market--where often their wages exceed those of students 
earning a bachelor's degree. And, like the chart above, the data also 
show how much earnings variation there is between graduates of 
different community colleges in the same field of study.
    Students and their families should have this information at their 
fingertips so they can make better informed decisions about where to 
enroll, what to major in, and how much debt they might comfortably take 
on relative to their likely earnings. About half the states have linked 
their student-level data with the unemployment insurance wage data (an 
effort supported by the federal State Longitudinal Data Systems grant 
program). But while many states have linked these data, few states have 
made those linked data known or easily available to the public, to 
individual campuses, or to their state legislatures.
    I am working with six states--Arkansas, Colorado, Nevada, 
Tennessee, Texas and Virginia--to get measures of the economic success 
of graduates into the public sphere. The Tennessee data and an 
accompanying report were released September 18th. Arkansas data were 
released in August, and the Virginia data will appear in October. The 
data for Arkansas and Tennessee are easy to search and compare at 
www.collegemeasures.org and the other states' data will be made 
available in the next few months.
Cost of Degrees
    Finally, we need more accurate data on the cost of producing 
college degrees. And let's not confuse cost with price here.
    Most consumers worry about price and know little about cost. If we 
go to Wal-Mart to buy a roll of paper towels and the price is $1.00, 
the fact it may cost 30 cents to produce is rarely on our radar 
screens. Consumer ignorance of cost is even more prevalent when 
government subsidies cloud the difference between price (what we pay 
for something) and cost (what it costs to produce it). When we look at 
a highly subsidized service, such as higher education, the divergence 
between price and cost can be substantial. In short, a college diploma 
that carries a low price tag can cost far more than people realize.
    Any discussion of the cost of degrees must be attuned to their full 
cost, including taxpayer subsidies, and must be standardized by success 
(e.g., number of completions). Without taking both factors into 
account, taxpayers will be left with the false impression that a degree 
or certificate is cheap (because tuition price is low), even though it 
may be quite expensive when all costs are totaled.
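The price-versus-cost distinction can be made concrete with back-of-the-envelope arithmetic. All figures below are hypothetical assumptions; the point is only that a low sticker price combined with subsidies and a low completion rate can yield a high full cost per degree.

```python
# Illustrative arithmetic only: why a low tuition "price" can hide a high
# full cost per degree. All figures are hypothetical assumptions.

tuition_per_student = 3_000        # the "price" students see, per year
subsidy_per_student = 9_000        # taxpayer appropriations and other support
years = 6                          # average years enrolled per entrant
completions_per_100_entrants = 40  # a 40 percent graduation rate

# Full cost per completion spreads the subsidized cost of everyone who
# enrolled (graduates and non-completers alike) over the degrees produced.
total_cost = (tuition_per_student + subsidy_per_student) * years * 100
cost_per_degree = total_cost / completions_per_100_entrants
print(f"Sticker price per year: ${tuition_per_student:,}")
print(f"Full cost per degree:   ${cost_per_degree:,.0f}")  # $180,000
```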
    We know that costs are driven by such things as (a) the mix between 
upper division specialized courses versus lower division general 
education ones and (b) the mix of majors--after all, physics labs cost 
far more than language labs. And the mix of students and majors also 
may vary with each campus' particular mission. True degree costs, then, 
must reflect all these variables.
    Many accounting issues also need to figure in any discussion of 
degree costs. For example, how should we allocate spending on research 
and administrative support? We have little information on capital 
costs, which in many campuses exceed operating costs. In short, the 
budgets of most higher education institutions are both sketchy and 
opaque, featuring little of the true grist needed to even start 
tabulating what a student's education costs taxpayers or how much 
campuses spend per degree.
    We also have no reliable way of estimating how much the tax exempt 
status of public and not-for-profit colleges and universities costs 
taxpayers. In more and more cities, for instance, conflicts are 
emerging between ``town'' and ``gown'' over payments in lieu of taxes 
(PILOTS). Fiscally strapped municipalities where tax exempt 
institutions represent a significant share of their potential tax base 
(Boston and Providence come to mind here) are looking to campuses for 
some form of payment--but under current law payment is at the campus' 
discretion. And tax exempt institutions pay no income or sales taxes--
in contrast to for-profit education systems, for which corporate taxes 
are likely over 10% of revenues and sales taxes 1 or 2 percentage 
points. These exemptions are real taxpayer costs but are ``off the 
books'' so often go unnoticed.
    Given these, and other related issues, we have no way of knowing 
how much taxpayers are investing in degrees through direct 
appropriations and through subsidies. And without an accurate cost 
accounting, it's hard to begin to assess the rate of return to 
taxpayers for their investment in higher education. I have been 
exploring this work with Jorge Klor de Alva, president of the Nexus 
Research and Policy Center. Last year we published a study of 
bachelor's degrees and are now studying taxpayer returns on associate's 
degrees.\3\
    \3\ See Who Wins? Who Pays? The Economic Returns and Costs of a 
Bachelor's Degree by Jorge Klor de Alva and Mark Schneider. Available 
at http://www.air.org/focus-area/education/
Risk Adjustment
    Higher education institutions in the United States vary widely in 
their missions, the students they serve, and the resources they have to 
educate those students. Many argue that a ``one size fits all'' 
approach to any metric is unfair to the institutions that are serving 
``nontraditional'' students--the majority of students in postsecondary 
education today. To compare students' college or labor market success 
in a highly selective not-for-profit college or public flagship school 
to that of students in a regional public four-year campus is clearly 
unfair. One solution to this problem is to establish risk-adjusted 
metrics that would allow us to compare individual campuses with their 
students' characteristics taken into account.
    Risk-adjusted metrics are not a new idea. For example, hospital 
outcomes are often compared using measures that take into account their 
missions and clientele. It's understood that patient populations in 
community hospitals vary greatly from those in, say, trauma centers. In 
higher education, we need some agreement on which student and campus 
characteristics need to be taken into account, perhaps starting with 
the risk factors that NCES has identified,\4\ but developing consensus 
around variables and methods requires more work. And we must take care 
so that risk adjustments don't let poor-performing campuses off the 
hook. A campus with a 25 percent graduation rate might have a ``risk-
adjusted graduation rate'' of 35 percent, but is 35 percent good 
enough?
    \4\ NCES has identified a series of risk factors. These include 
delayed enrollment, part-time attendance, financial independence, 
having dependents or children, single parenthood, and no high school 
diploma. See http://nces.ed.gov/das/epubs/
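One simple form of risk adjustment is to compare each campus's actual graduation rate with the rate predicted from its student-risk profile. The toy model below is purely illustrative: the coefficients, the two risk factors, and the campus figures are invented assumptions, not NCES's or anyone's actual methodology.

```python
# Minimal sketch of risk adjustment: compare a campus's actual graduation
# rate to the rate predicted from its student-risk profile. The linear
# model, coefficients, and campus data are illustrative assumptions.

def predicted_rate(pct_part_time, pct_pell, baseline=0.60):
    """Hypothetical model: each risk factor lowers the expected rate."""
    return baseline - 0.30 * pct_part_time - 0.20 * pct_pell

campuses = {
    "Campus A": {"actual": 0.25, "pct_part_time": 0.70, "pct_pell": 0.60},
    "Campus B": {"actual": 0.55, "pct_part_time": 0.10, "pct_pell": 0.20},
}

for name, c in campuses.items():
    expected = predicted_rate(c["pct_part_time"], c["pct_pell"])
    # A positive residual means the campus beats its risk-adjusted
    # expectation; a 25 percent rate can still be near expectation.
    print(f"{name}: actual {c['actual']:.0%}, expected {expected:.0%}, "
          f"residual {c['actual'] - expected:+.0%}")
```

Note that in this toy example Campus A's 25 percent rate sits close to its risk-adjusted expectation, which is exactly the "off the hook" concern raised above: a risk adjustment must still let us ask whether the expectation itself is good enough.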
Consumer Information vs. Regulation
    Let's assume that over time we develop better metrics to gauge the 
performance of our institutions of higher education. Then what? The 
U.S. Department of Education's effort to regulate based on Gainful 
Employment shows the risks of getting too far ahead of the quality of 
the data.
    As is well known, a federal court ruled this past summer that the 
repayment ratio, one of the Department's three Gainful Employment 
metrics, was ``arbitrary and capricious'' and that no research backed 
up its 35% threshold for imposing penalties on campuses. While the 
Department's right to regulate on GE was upheld by the court, the 
current effort has once again hit a major stumbling block.
    The problem here, I believe, is that the Department has been so 
focused on Gainful Employment as a regulatory issue that it has 
neglected an equally crucial role--getting the information it has 
collected into the hands of students and their families in an 
understandable format. The huge effort expended on the three regulatory 
ratios (debt to earnings; debt to discretionary earnings; repayment 
rates) meant that too little attention was paid to what is arguably the most 
important piece of information in the entire GE data release in June of 
2012: the average earnings of graduates of covered programs. Indeed, I 
have been told that there was serious discussion about not even 
releasing earnings data at all!
    While the Department of Education has made some moves toward making 
its data more consumer friendly, its Gainful Employment efforts missed 
opportunities to be more useful to students. For example, in its June 
2012 release of the Gainful Employment data, it had a column of data 
labeled ``debt to earnings annual rate denomina.'' In fact, this is the 
average earnings of graduates from thousands of programs throughout the 
country.
    These earnings data contain valuable information not conveyed by 
the ratios. Here, for example, are the average earnings of graduates 
from four of the most commonly offered programs in California covered 
by the GE data.

                Program                   Average    Minimum    Maximum
Cosmetology............................    $11,119     $7,141    $16,912
Massage Therapy........................    $14,339     $8,306    $21,034
Medical Assistant......................    $16,257     $8,951    $27,175
Licensed Practical Nurse...............    $38,838    $20,340    $68,871

    Earnings data reported in dollar terms convey information 
understandable by most people. Ratios don't. Indeed, the regulatory-
based ratios could easily lead to poor decisions: consider that for 
cosmetology, the average debt to income ratio was 3.6%, lower than any 
of the other programs shown above, and the maximum ratio was 11.8%, 
below the 12% ``trigger'' of the GE regulations. Yet, graduates of 
cosmetology programs earned far less than graduates from the other 
programs.
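The mismatch between ratios and dollars can be seen directly. In the sketch below, the average earnings come from the table above and the 3.6% cosmetology ratio from the text; the other ratios are assumed for illustration only.

```python
# Why a "passing" debt-to-earnings ratio can mislead: cosmetology has the
# lowest ratio yet by far the lowest earnings. Earnings are from the GE
# table quoted above; ratios other than cosmetology's are assumptions.

programs = {
    # program: (average first-year earnings, average debt-to-earnings ratio)
    "Cosmetology":              (11_119, 0.036),
    "Massage Therapy":          (14_339, 0.055),   # ratio assumed
    "Licensed Practical Nurse": (38_838, 0.055),   # ratio assumed
}

for name, (earnings, ratio) in programs.items():
    # The ratio hides the dollar scale: a small share of a small income.
    implied_payment = earnings * ratio
    print(f"{name:26s} earnings ${earnings:>6,}  "
          f"ratio {ratio:.1%}  implied annual payment ${implied_payment:,.0f}")
```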
    Unfortunately, these simple dollar figures can be hard to find. In 
its downloaded data set, the Department, as noted, unhelpfully labeled 
them ``debttoearningsannualratedenomina.'' And the entire Gainful 
Employment data base was released as a ``flat file'' consisting of 
almost 14,000 lines of data, so locating data for a program or 
comparing programs across institutions isn't for the faint of heart. 
That's why I created a far more user-friendly interface that can be 
found at http://collegemeasures.org/gainfulemployment/.
    Clearly, given the amount of taxpayer money invested in our 
colleges and universities, the government has an interest in making 
sure that the money is not spent frivolously. And the rate of return on 
both student and taxpayer investments in higher education matters a 
lot. The problem is that most of the data we have now are not precise 
enough to let us pick firm cut-off points fairly--for example, it is 
difficult to justify disqualifying a program with a repayment rate below 
35% from participation in Title IV programs but not one with a 
35.1% repayment rate. However, if we view these data as informing 
consumer choice and seek to create reliable tools to allow students, 
their families, and their government representatives to view these data 
within a comparative framework, we can increase accountability by 
empowering consumer choice.
    I also believe that had we approached these data with a consumer 
information framework rather than a regulatory one, the Department 
might have been able to make progress resolving one of the most severe 
limitations on the current data: it could have expanded the coverage 
from just the for-profit sector to both public and not-for-profit 
institutions.
Reauthorization of the Higher Education Opportunity Act (HEOA)
    Measurement of student success can be improved and IPEDS can and 
should be modernized. The reauthorization of HEOA provides such an 
opportunity. Some of the issues touched here may require more time to 
resolve and may need to be addressed outside of HEOA. (For example, 
some of IPEDS' value in documenting higher education finance is limited 
due, at least in part, to shortcomings and differences in GASB and 
FASB).\5\ Assessing student learning is a step too far for Congress to 
undertake given the current state of the science of assessment and 
given legitimate concerns about the scope of federal intervention.
    \5\ Public institutions follow Governmental Accounting Standards 
Board (GASB) standards, and private (for-profit and non-profit) 
institutions follow Financial Accounting Standards Board (FASB) 
standards. Each Board has a distinct mission and IHEs following these 
different standards report data differently, creating challenges in 
comparing data across institutions.
    However, we can and should improve our measurement of labor market 
outcomes, and Congress has the right and the obligation to ask what 
hundreds of millions of dollars in state longitudinal grants have 
bought us in terms of information that helps students, their families, 
and taxpayers make the right decisions about higher education.
    Chairwoman Foxx. Thank you, Dr. Schneider.
    Dr. Hallmark?


    Mr. Hallmark. Madam Chair, Ranking Member Hinojosa, I am 
honored to be here. I am James Hallmark, the vice chancellor 
for Academic Affairs of the Texas A&M University System. And we 
use data in the A&M system across a broad range of endeavors 
from P-12 to traditional analytics to student learning 
outcomes. I have long advocated data-centered decision making, 
although often universities base decisions more on anecdotes 
and innuendos.
    Furthermore, the research on human decision making 
indicates that we are more often likely to use data to confirm 
decisions we have already made, rather than base decisions on 
that data. An example in my system is that the students and 
parents often select Texas A&M University for the Aggie ring 
and its traditions more so than the 92 percent persistence and 
retention rate of the institution.
    Still, we cannot make data-centered decisions without data. 
And universities are awash in data. The challenge is less in 
developing data than in transforming the 
existing data into usable information. And while doing so we 
most commonly focus on a handful of student success metrics and 
efficiency measures.
    Student success measures are items such as retention rates, 
time to graduation, graduation rates, number of graduates. For 
the public, understanding these terms is problematic in itself. 
I know that a retention rate refers to the percentage of first-
time, full-time fall freshmen taking at least 12 hours 
returning the next fall. But I would assert that that nuance is 
not known or understood by consumers of the information.
    I have a college freshman daughter and even I found the 
language confusing as I helped her negotiate college admission 
and FAFSA this past year. So, I cannot imagine how overwhelming 
it might be for someone out there who is not familiar with the 
terminology.
    But even when understanding the terms, the data must be 
interpreted within context. For example, a university with a 48 
percent, 6-year graduation rate might be performing better than 
a university with a 68 percent, 6-year graduation rate, 
depending on the demographics of the students admitted. 
Institutions can also manipulate those metrics by denying 
admission to students with the greatest needs, even though 
doing so would not be in anyone's best interest except the 
institution in reporting these metrics.
    Also, the measures may not adequately address the full 
range of student goals. Many community college students seek a 
foundation for transfer or a skill necessary for a specific 
job. Yet, when those students leave before graduation they are 
classified as failures in our data systems, even though the 
institution provided exactly what that student needed.
    We must also be careful in what we measure and what we 
reward when considering efficiency metrics. For example, 
expenditure per full-time student equivalent is a common 
efficiency measure. And some view this as a measure of quality.
    In fact, U.S. News & World Report's rankings put a 10 
percent weight on that particular measure, arguing that 
generous per-student spending indicates that a college can 
offer a wide variety of programs and services. Others would 
argue that the same metric is a measure of efficiency: that we 
must reduce expenditures to provide access and reduce student 
indebtedness. Another perspective is that this measure is a 
function of institution size, as larger institutions benefit 
from economies of scale.
    The point is that we have to be wise in interpreting these 
data. We must understand the variance in institutional mission 
and size and its impact on these variables.
    In the A&M System's analytics project, all of our institutions 
have identified what we call stretch goals regarding 
expenditures. Some of our institutions seek to increase their 
expenditure per full time student equivalent, actually increase 
it because of needed additions in student support or an 
expanding, changing mission, while other A&M System 
institutions seek to lower their expenditure levels for greater 
efficiency.
    Now, I will close with some comments on the A&M System's 
data project. And as well I want to mention the Voluntary 
System of Accountability. In the A&M System we have set forth 
as our first step in this project to prepare teachers 
differently, to transform how we are preparing teachers, 
educational administrators, focus on STEM education and our 
research across the broad spectrum of education. But it is not 
enough to say that we are transforming public education. We 
have specific measures to assess our progress and hold 
ourselves accountable.
    Now, once a student enters an A&M System institution we 
follow their progress to meet our student success metrics, and 
our metrics are common to those used across the country. But in 
addition we track what we call governance metrics, outcome-based 
funding metrics, and excellence metrics. And the key data in 
each category is accompanied by stretch goals with targets for 
2015 and 2020.
    Finally, the A&M System is addressing what most projects 
are not, and that is student learning outcomes. It is tricky, 
but we are doing it anyway. It is not enough to graduate 
students. We are collecting data to know if we adequately 
prepare our graduates for the next stage in their life.
    Finally, I only have 20 seconds, but I want to mention the 
Voluntary System of Accountability. I do not know of any better 
national collaboration to standardize data and make it 
understandable. It is a continuous tweaking process. But it is 
an excellent model.
    VSA is sponsored by APLU and AASCU, and eight of the 11 
universities in the A&M System participate in VSA. And much 
like the A&M System's effort, the VSA is completely transparent 
with links to the VSA data on the front page of every VSA 
member's Web site.
    Thank you.
    [The statement of Mr. Hallmark follows:]

     Prepared Statement of Dr. James Hallmark, Vice Chancellor for
             Academic Affairs, Texas A&M University System

    Good morning. I'm James Hallmark, Vice Chancellor for Academic 
Affairs for the Texas A&M University System (the A&M System). The A&M 
System consists of nearly 125,000 students spread across our 11 
universities and our Health Science Center. In addition, we count 7 
state agencies among our number, all aligned to serve the education, 
research, and service needs of Texans.
    I am particularly pleased to visit with you about university data 
projects, and more importantly using data to provide valuable 
information to students, parents, institutions, taxpayers, and elected 
officials.
    I have long advocated data-centered decision making. Taxpayers, 
regents, lawmakers, parents, and students all need data to make good 
choices with their money, their policies, and their futures. And while 
every university leader asserts decisions are based on data, my 
experience has been that university decisions are most often based on 
anecdotes and innuendos. A good story often outweighs a hundred pages 
of statistics indicating otherwise. And certainly the research on human 
decision making indicates we typically use data to confirm decisions we 
have already made. For example, I am confident parents and students 
select Texas A&M University for the Aggie ring and traditions more so 
than for the 92% persistence rate.
    Though we will continue to be influenced by non-data based factors 
in making choices, we cannot make data-centered decisions without 
usable data.
    Universities have long been awash in data. The challenge is 
generally less in developing data than in transforming the existing 
data into forms understandable to the public and decision makers. Only 
then can we understand the data and use it, and also recognize any gaps 
in our existing data.
    The A&M System has embarked on an ambitious data project across a 
broad range of endeavors--literally everything from PK-12 to 
traditional analytics to student learning outcomes--with a goal of 
being accountable to the public and transparent in that accountability. 
We employ data to help us more wisely use the finite resources with which 
we are blessed to serve our students and our state. Significantly, this 
project has the full support of Chancellor John Sharp and the chair of 
the Committee on Academic and Student Affairs, Regent Elaine Mendoza, 
and the project is led by a vice chancellor, myself. This is important 
as it requires powerful and influential leadership to guide a major 
data project from infancy to maturity. We would not be successful in 
the A&M System without it.
    I will provide what I hope are useful insights into the use of data 
in aiding constituent decisions, while also providing information on 
the A&M System's project. I will also comment briefly on the ``VSA'' 
project (Voluntary System of Accountability), a comprehensive national 
effort using data to aid students and parents in making choices. I 
will, of course, be happy to address any of these matters during your 
questions.
    Accountability in higher education typically focuses on a handful 
of metrics, such as persistence rates, time to degree, graduation 
rates, and number of graduates. It may be useful to reference these as 
``student success metrics.'' These are reasonable measures, and are 
important. According to the National Center for Higher Education 
Management Systems, only about 30% of 8th graders will obtain a higher 
education credential in 11 years (the equivalent of a six-year 
graduation rate). In Texas, only about 20% of 8th graders have a higher 
education credential in 11 years. The deficiency in Texas is even more 
alarming for African-Americans (11.4%) and Hispanics (11.6%). Simply, 
for the good of our society we must be held accountable for moving the 
needle on these metrics. We can only move the needle if we measure and 
track the information and systematically apply the findings as part of 
our decisions on how we structure our institutions and processes.
    For the public, the greatest challenge in using these student 
success metrics may be in understanding the language. In higher 
education, we know that a retention rate refers to a specific measure, 
but I would assert the term is not meaningful to consumers. The 
retention rate references the first time, full time cohort of freshmen 
entering the institution in a fall semester who remain enrolled at the 
same institution the following year. This is a useful and important 
measure for lawmakers, university administrators, and regents, but for 
first generation students and their parents the language and utility of 
the measure is not readily apparent and even confusing. Personally, I 
am a higher education professional, and yet I found--as my college 
freshman daughter negotiated college admission and FAFSA--my knowledge 
of the game had its limits.
    Beyond understanding the key terms, the data cannot--or at least 
should not--be interpreted without context. For example, a superficial 
understanding would conclude a university with a 68% six-year 
graduation rate is performing better than a university with a 48% six-
year graduation rate. However, the reverse may be true. The 68% 
university may be underperforming based on the academic preparedness 
and socio-economic status of its incoming students, while the 48% 
university may be over-performing based on the input characteristics. 
Allow me to explain: if an institution is primarily drawing its 
students from white non-Hispanic households, where both parents are 
college graduates, and where few are Pell eligible, a 68% six-year 
graduation rate is poor. Meanwhile, for an institution drawing 
primarily from underrepresented populations with a high percentage Pell 
eligible and mostly first generation, a 48% six-year graduation rate is 
remarkable. Lawmakers, regents, and parents must understand the different 
missions of these institutions in interpreting this data.
    It is also important to prevent institutions from ``gaming'' the 
metric. For example, institutions can artificially improve persistence, 
retention and graduation rates by truncating the freshman class, 
eliminating those students with the greatest needs. An access oriented 
institution could choose to limit admission to those with an ACT score 
(or equivalent) of 20 or higher and dramatically increase retention, 
persistence, and graduation rates, even though doing so may not be in 
the best interests of the community, the region, or the nation.
    It should also be noted that the standard measures for student 
success do not adequately address the full range of goals of all 
students. This is most evident in community colleges where a 
significant number of students seek a foundation for transfer or a 
skill necessary for a specific job opportunity. Current data reporting 
metrics often underreport the community college's success, even though 
the institution provided what the students needed, because common data 
metrics only report retention and graduation rates at the students' first 
institution. Similarly, one institution in the A&M System is a special 
purpose institution focused on maritime disciplines. A student at Texas 
A&M University at Galveston (TAMUG) who chooses to major in something 
other than maritime will transfer to Texas A&M University and is 
reflected in TAMUG's data system as a ``failure,'' even though that 
student may graduate and become the world's next great agricultural 
engineer. Again, the shortcomings of the standard measures do not 
account for the progress and success of many students. (The Voluntary 
System of Accountability addresses this challenge by tracking students 
across any institution.)
    Moving from a discussion of metrics associated with student 
progression to a discussion of efficiency measures, higher education, 
like all facets of society, must do more with less. We must be publicly 
accountable to those who are providing resources, whether that source 
is public funding, tuition and fee dollars, or philanthropy. We must 
demonstrate that we are being efficient with those resources, that we 
are investing our resources not in frivolous activities--however that 
may be defined--but in activities targeting appropriate service to the 
education, research, and service needs of our students and our service 
area.
    Much like student success metrics, efficiency measures must be 
interpreted with caution. For example, ``expenditure per full time 
student equivalent'' is a common measure of efficiency that is also 
viewed by some as a measure of quality. (And to revisit the concern 
mentioned above, it is also a measure that may not immediately be 
understood by consumers.) For example, the ubiquitous U.S. News and World 
Report ranking places a 10% weight on their version of this measure, 
arguing that ``generous per-student spending indicates that a college 
can offer a wide variety of programs and services.'' Others may use 
this same metric as a measure of efficiency, arguing that we must 
reduce expenditures to provide access and reduce student indebtedness. 
Furthermore, expenditure per full time student equivalent may be more a 
function of an institution's size than anything else, as larger 
institutions benefit from economies of scale.
    The issue I seek to address is not to avoid data reporting or 
accountability related to efficiency, but rather to use the information 
wisely, understanding the variances in institutional mission and size 
and their impact on the measure. In the A&M System's analytics project, 
our institutions have identified ``stretch goals'' for selected 
metrics, including expenditures per full time student equivalent. Some 
of our institutions seek to increase their expenditure per full time 
student equivalent to expand student resources in support of student 
success or in pursuit of a changing mission such as ``downward 
expansion'' from an upper-level only institution into a four-year 
institution. Other institutions that may already offer a full range of 
services have stretch goals to lower their expenditure levels as they 
seek efficiencies within their systems.
    My final set of comments will focus on a more complete overview of 
the A&M System's data project. First, we firmly believe we have a 
responsibility to improve public education (PK-12) in the state of 
Texas. No other entity in Texas produces more teachers than the A&M 
System while supporting innovation and leadership through 
groundbreaking research. We at the A&M System have an obligation to 
continue to improve, to transform how we prepare teachers, to better 
prepare educational administrative leaders to support and lead these 
transformations, to renew our focus on STEM education particularly in 
the primary grades, and to expand the research we conduct across the 
broad spectrum of education.
    But it isn't enough to say we are going to transform public 
education through our focus. We have specific metrics and targets that 
tell us how well we are performing in preparing and supporting 
teachers, in preparing and supporting educational administrative 
leaders, in improving performance in STEM disciplines, and in education 
research. These data are essential in helping us transform public 
education.
    Once the student enters an A&M System university, we begin to 
measure their progress via analytics. We collect and analyze data for 
typical metrics used by universities across the country (detailed 
breakdowns of enrollment trends, for example, such that I can tell you 
how many Hispanic females from Coleman county are majoring in a STEM 
discipline at the A&M System's Tarleton State University campus--the 
answer is 2). We track about 50 variables in this manner, with 
significant ``drill down'' capability to aid students, parents, 
regents, and lawmakers in decision making. With this data, students and 
parents can make decisions on the likelihood of quick progression to 
graduation, or regents can track trends in research expenditures, among 
other possibilities.
    We have also organized our data to reflect specific interests 
within Texas. One set of data focuses on ``Governance,'' data our Board 
of Regents has identified as central to its decision-making task. 
Another set of data focuses on ``Outcomes Based Funding,'' data the 
state of Texas has proposed for influencing institutions' state 
funding. Yet another set of data focuses on ``Excellence,'' data that 
tracks how each institution is moving toward better fulfilling its 
mission. Key data in each category is accompanied by stretch goals, 
targets for 2015 and 2020.
    Finally, within the A&M System's paradigm, we are addressing what 
most projects are not--student learning. Too often data projects 
neglect to systematically measure the knowledge and skills of the 
students who graduate from our institutions. It is not enough to 
graduate students. Instead, we must collect evidence to know we are 
adequately preparing our graduates for the next stage in their life, 
whether that is graduate school, professional school, or the workforce.
    At the A&M System, we are collecting data to demonstrate a value 
added to the student via their encounter with our universities. Our 
identified outcomes are not unique, relying heavily on national models, 
such as the American Association of Colleges and Universities' 
``Essential Learning Outcomes'' within the ``LEAP'' initiative. These 
models provided a foundation for us to choose to hold ourselves 
accountable that our graduates will communicate well, have outstanding 
critical thinking skills, be ethical decision makers and engage 
responsibly in society, have a global perspective and an appreciation 
for cultural diversity, problem solve well, integrate the broad 
knowledge obtained through their undergraduate experience, and possess 
the knowledge specific to their discipline of study. We are entering 
into a data management and reporting project assessing each of these 
learning outcomes.
    I will close with a brief reference to the VSA--the ``Voluntary 
System of Accountability''--mentioned earlier. The VSA was developed in 
2007 to better demonstrate public university accountability and 
transparency, particularly in the areas of access, cost, student 
progress, and student outcomes. The VSA is sponsored by the Association 
of Public and Land-grant Universities and the American Association of 
State Colleges and Universities and includes 60% of all public 4-year 
institutions.
    Eight of the 11 universities in the Texas A&M University System 
participate in the VSA and publicly report a common set of data on the 
VSA College Portrait. The VSA College Portrait provides common, 
understandable and useful data for students, families, state officials, 
policy makers, and accreditors. As an example of national collaboration 
to provide common, understandable and useful data, I know of no better 
model. And much like the A&M System's effort, the VSA is completely 
transparent, with links to the VSA data on the front page of every 
VSA member's website.
    Ultimately, it is up to the student to succeed, and many in higher 
education blanch at being held accountable for the behaviors of 18 year 
olds. Regardless, given the resources devoted to higher education and 
the demands and needs of society for higher education to produce 
contributing members of society, accountability is unavoidable. The 
wise approach to accountability assures we are being accountable for 
the right stuff and interpreting the data wisely.
    Chairwoman Foxx. Thank you very much.
    Dr. Cruz, you are recognized for 5 minutes.


    Mr. Cruz. Good morning, Chairwoman Foxx, Ranking Member 
Hinojosa. Thank you for the opportunity to testify before you 
this morning.
    My name is Jose Cruz. I am the vice president for Higher 
Education Policy and Practice at The Education Trust. And I am 
a former vice president of Student Affairs at the University of 
Puerto Rico System.
    Going to college has always been a major financial 
commitment. But given the cost of college today, the 
consequences of that decision have grown exponentially. Yet, 
access to quality, relevant information to inform that decision 
is grossly limited. Fortunately, current federal databases 
provide a foundation upon which to build a data system that 
will not only better inform parents and students, but also help 
decision makers develop effective policies that benefit 
students, and help institutions operate more efficiently.
    One such database is the National Student Loan Data System, 
NSLDS. The NSLDS compiles data on federal student loans, but it 
has two major shortcomings. First, it does not compile data on 
private student loans.
    And second, it does not include a flag to indicate whether 
a particular student has completed his or her program of study. 
By expanding this data system to include information on private 
student loans, and a completion indicator, NSLDS could provide 
much more useful information about the debt levels and types of 
debts incurred by student completers and non-completers alike.
    The main source for post-secondary educational data is the 
Integrated Postsecondary Education Data System, IPEDS. IPEDS has a 
treasure trove of information, but the usefulness of this 
information is limited because the data is incomplete. As 
Ranking Member Hinojosa stated previously, IPEDS does not 
provide graduation rates for anyone but first time, full time 
students. And the information it does provide on graduation 
rates of 2-year institutions is significantly limited.
    IPEDS also does not provide graduation rates for Pell Grant 
recipients, nor does it provide graduation rates for Stafford 
Loan borrowers. And it does not provide information on student 
job placement rates and earnings.
    Collecting and reporting these few additional pieces of 
information would go a long way toward ensuring that students 
and their families, as well as the institutions themselves, 
have a true picture of an institution's ability to support all, 
not just some, of their students through graduation.
    Now, I know that when we talk about additional reporting, 
whatever the context, the issue of burden on the organizations 
tasked with that reporting always arises, and rightfully so. 
But we cannot let the fact that some effort is required be the 
sole determinant in the conversation. 
    If reporting arms students with the information they need 
to make good decisions, and provides institutions insight into 
how they are doing and where they need to improve, then it is 
hard to make the argument that such reporting is too 
burdensome. Rather, it is indispensable.
    It is true that a lot of data already exists in federal 
databases, state longitudinal data systems, and system and 
institutional research offices. So, we 
should use it to full effect. But we need someone to gather and 
validate the quality of such data and make it available in a 
simple and usable format for students, policymakers and 
institutional leaders to benefit from it. And this someone can 
and should be the federal government.
    The fact is that we know that current reporting can be 
enhanced without overly burdening those doing the reporting. 
Dr. Hallmark has talked about the Voluntary System of 
Accountability. And Dr. Schneider talked about the Complete 
College America and NGA work in this area.
    And we have also been working with some other systems and 
institutions in the past 5 years in a different initiative 
called the Access to Success Initiative where 22 public higher 
education systems with 312 campuses serving over 3.5 million 
students have developed a set of common definitions as well as 
metrics, protocols and tools to track overall enrollment and 
completion rates for the part-time, for the transfer and for 
the low-income students that are currently missing from IPEDS. 
And they have been voluntarily reporting this data for the past 
5 years.
    So, in closing, I would say that we have no time to waste. 
College tuition and fees are growing almost twice as fast as 
health care costs and about four and a half times as fast as 
inflation. And student loan debt now exceeds $1 trillion, and 
outpaces credit card debt in this country.
    This comes at a time when almost half of our students in our K-12 
schools are low-income, and when our youngest generation is 
just barely better educated than their parents. Data 
transparency for students, parents, policymakers and 
institutional leaders of higher education has never been more 
critical. Thank you.
    [The statement of Mr. Cruz follows:]

    Chairwoman Foxx. Thank you, Dr. Cruz.
    Dr. Fitzsimmons, you are recognized for 5 minutes.

          STATEMENT OF TRACY FITZSIMMONS, PRESIDENT,
                      SHENANDOAH UNIVERSITY

    Ms. Fitzsimmons. Good morning. I am so pleased to be here 
with you today.
    I am president of Shenandoah University. We are located 70 
miles to the west of here in the beautiful Shenandoah Valley in 
Winchester. We educate 4,000 students every year; half 
undergraduate, half graduate.
    We are not a liberal arts institution. We are not a 
professional institution. We have a foot sort of firmly in both 
camps. So, we are concerned about educating broadly, but also 
about educating our students for a career.
    I am here representing Shenandoah University and NAICU. I 
am also here as a PhD in political science, which means I 
believe deeply in data, and that good data helps inform great 
discussions and dialogue in the classroom and here in Congress. 
But I hope as you move forward in these discussions about data 
and higher education that you will think about data with the 
following questions.
    First is, do we really need more data? I brought with me a 
stack--a representative stack of all the information that is 
available to the general public on just one institution, mine, 
Shenandoah University. Anyone in the public can read all of 
this. It has got financial information, student demographic 
information. It has got average class size and so many other 
things. We have got tons of data out there. Let us make sure 
that we are using it first.
    If you want to add more layers, could you please take away 
some, because the administrative burden is tremendous. It is 
worth it if the information is well used by taxpayers and 
students. But it is a tremendous financial burden. That was the 
second area, the burden.
    The third area is if you are going to move towards a common 
dataset, if you are going to push taxpayers and students to 
focus on one set of numbers that is generated centrally by the 
federal government, I hope you will be aware that you could be 
inadvertently creating a situation that pushes 
colleges and universities not to take a bet on high-risk 
students, on low-income students.
    Just this fall at Shenandoah University when we welcomed 
our freshman class, one of our students came to us not from his 
home, but from a homeless shelter; straight from the homeless 
shelter to the dorms at Shenandoah University. He does not have 
family support. He does not have a family history of college. 
He is not a great bet to graduate. But he deserves a chance.
    If you would force me to be compared all the time against 
other colleges and universities only focused on graduation 
rates, retention rates, I might not have been able to accept 
him. But he deserves an opportunity. We want to make sure not 
to create a common dataset that perversely leads us to exclude 
those who perhaps need college the most.
    Next, students I find are actually really interested in the 
intangible. I asked my students in preparation for today, what 
is it that drove you to choose a college? What did you think 
about? What kind of data?
    They wanted to know do we have the majors that they were 
looking at, that they were considering. Will we help them 
prepare for certain jobs? But they also have other questions.
    Some students are looking for a highly Christian 
institution. Some want an institution that is not religious at 
all. Some want a left-leaning or right-leaning institution. 
Some want a place that is very environmentally focused. Others 
want to know that as a home-schooled high school student that 
they will feel at home at that institution.
    It is very much for some of them, for many of them about 
the fit. Does it feel right? Will these faculty members 
motivate me, stand by me until graduation? That is going to be 
really hard to measure in a common dataset.
    I would also like to remind you that we have an incredible 
peer review accreditation system in this country. In terms of 
accountability, both in terms of academic quality and also 
financial responsibility we are harder on each other than 
anyone could possibly be on us.
    And finally, I would like to urge you to let the market 
bear some of the burden. We exist in this wonderful free and 
fair country politically and economically. And there is room 
for the market to do work here as well. I see it every day.
    We know the faculty members that are not keeping current on 
the information, that are not offering courses students want or 
are not great engaging teachers. Students vote with their feet. 
They do not sign up for those classes.
    The same thing happens for colleges and universities. If 
you are not offering the majors that students are looking for, 
if you are not paying attention to students' needs and parents 
and taxpayers as well, they will vote with their feet. They 
will not apply. They will not come to our institutions.
    There is a lot of data available. I am happy to create more 
data for you to work with you on it if it is useful to the 
taxpayers and to the students.
    I would end by saying I have the pleasure every day of 
working with the students that you help, the ones who could not 
go to school if it were not for your financial support. And I 
am grateful on behalf of those students, and all those across 
the country. But as you think of data, I also hope that you 
will use data, create data in ways that will urge American 
higher education to deepen, not dampen our commitment to 
provide ability and opportunity to students across our country, 
the many, many different kinds of students that exist. Thank you.
    [The statement of Ms. Fitzsimmons follows:]

        Prepared Statement of Dr. Tracy Fitzsimmons, President,
                 Shenandoah University, Winchester, VA

    Chairwoman Foxx, Ranking Member Hinojosa, and members of the 
subcommittee, I appreciate having the opportunity to appear today to 
discuss higher education data issues. I am Tracy Fitzsimmons and I am 
president of Shenandoah University, located 70 miles to the west of 
Washington D.C., in Winchester, Virginia--the top of the beautiful 
Shenandoah Valley.
    Shenandoah University educates 4,000 students in its undergraduate, 
master's and professional doctoral programs. Shenandoah is not a 
liberal arts institution, nor is it a pure professional school. Rather, 
Shenandoah offers its students the broad education necessary to be 
active and informed citizens, while also training those students for a 
specific career. The 68 degree programs at Shenandoah are housed across 
six schools: Business, Conservatory, Education & Human Development, 
Arts & Sciences, Pharmacy, and Health Professions.
    In real-life terms, Shenandoah educates the students who will be 
the police officers and teachers and accountants of our communities; 
the nurses and physical therapists who will care for us as we age; the 
environmentalists and entrepreneurs who will compel our country to do 
more and better; and the performers of Broadway or the Kennedy Center 
who will touch our souls with their artistic performances.
    Today, I represent not only Shenandoah University, but also the 
National Association of Independent Colleges and Universities (NAICU), 
a public policy association for non-profit higher education that 
represents more than 960 private, non-profit colleges and universities 
and more than 70 specialized independent college associations. NAICU 
has long been involved with issues related to the collection and use of 
data.
    My thanks to you for holding this hearing. Having a PhD in 
political science, I think good data helps inform good decisions--both 
in the classroom and in Congress. And I think that government has 
contributed to quality data across American higher education. However, 
the question of data--what the government should collect, and how it 
should collect and use it--is central to education policy. As you 
consider this question, I urge you to keep in mind several questions 
about additional levels and means of federal data collection:
1. Will the benefits of new data requirements outweigh the costs?
    There are already reams of data easily available to the public to 
help them make decisions about how to assess colleges in the higher 
education sector.
    For example, many data discussions involve longitudinal data 
systems. These systems are being built in the belief that tracking 
individuals throughout their schooling and on through the workforce can 
assist in developing more successful educational and employment 
strategies. At the same time, they raise serious concerns about student 
privacy--a longstanding concern of NAICU. In addition, we believe that 
some current information is collected without a clear policy purpose, 
retained long after its purpose has expired, and used for unintended 
purposes. Too much data, or out-of-date data, only serves to confuse or 
mislead those who were the intended beneficiaries.
    Frequently, data issues center on the large and growing list of 
disclosures that institutions are required to provide. Like regulatory 
kudzu, it seems that every new problem gets a new proposed disclosure, 
but none of the old ones ever go away. We are concerned that such 
excessive requirements place a great administrative burden on 
institutions--a burden that I and other presidents are certainly 
willing to bear if it serves a productive purpose. I will address this 
issue later in my testimony.
    For now, however, you have asked me to consider whether it is 
possible and cost-effective to identify a limited set of data upon 
which everyone can rely in evaluating institutions. Can policy gains be 
made while also saving costs in red-tape and money to our universities, 
to our taxpayers and to our students?
2. In the effort to provide students, parents and taxpayers more data, 
        will you implement measures that make it more difficult for 
        colleges to give at-risk students a chance?
    Using retention rates and graduation rates as a be-all, end-all 
measure of institutional worth could lead to this result. The best way 
for any college or university to increase its graduation rate is to 
enroll traditional, high-achieving students--you know to whom I refer: 
the 18 year-olds who have stable families, attended the best high 
schools, flew through high school with an A average, and have 
significant financial means. I, and other presidents, certainly want 
those students to succeed in college--and the odds are in their favor. 
But many of us also want students from the broad spectrum that makes up 
America to be able to have a chance at college * * * the 25 year-old 
single mother, the veteran suffering from PTSD, the C+ student who is 
bright and motivated but struggled to make good grades in high school 
because he was working two jobs to help pay the rent. Shenandoah, and 
many other schools, believe those students also deserve the opportunity 
to go to college. But if Congress takes measures to position graduation 
rates as the key indicator of institutional value, then you will force 
my colleagues and me to narrow the range of applicants we accept. Just 
this fall at Shenandoah, we drove two hours away to pick up one of our 
incoming freshmen students from a homeless shelter--his family had lost 
their home earlier this summer--and Shenandoah has committed to 
providing significant levels of financial support to him. But Federal 
rankings based on graduation rates might have led us to think twice. We 
also welcomed into our freshman class a student who is the youngest of 
six children, the first in his family to go to college, from a 
household where Spanish is the primary language. Both of these young 
men had at least a B average in high school--and I believe in them--but 
I also know that both will face significant challenges in moving 
through college to graduation. But they deserve a chance at higher 
education. If you force colleges to play the graduation numbers game, 
we will think twice about admitting students who are not the absolutely 
best-bet to graduate.
    Furthermore, are you assured of choosing the right measurements? 
Right now, for example, there is much attention paid to retention and 
graduation rates. Yet in fields such as nursing or physician assistant 
studies, completion of the degree program is not the key measure--the 
crucial measurement is how many of the students completing the program 
pass their board exams, because without passing they cannot practice in 
their chosen field.
3. Will your use of data push higher education away from independent 
        thought and creative problem-solving toward equating value only 
        with financial return?
    A sound college education prepares our graduates not only to enter 
the workforce, but it also provides them with a deeper understanding of 
the world around them. Focusing on employment earnings as the primary 
measure of value diminishes the deeper benefits of education, reduces 
the flexibility to address new educational needs, and ignores the very 
real contributions to society by those who choose to pursue lower paid 
service occupations. Right now, in Virginia, the State Council of 
Higher Education is preparing to release to the public a website that 
lists Virginia's public and private colleges according to how much 
money their graduates earn 18 months and five years after graduation. 
While I am pleased that my institution comes out high on the chart, 
there are many institutions of high educational quality that end up at 
the bottom of this list. I am vehemently opposed to creating and 
pushing such data sets to students and parents. What is the message? 
That those colleges who educate future hedge fund managers and 
physicians are somehow more valuable than those who educate our future 
ministers, middle managers, teachers, and part-time-worker-stay-at-home-
parents?
4. Will your use of data shift the historical focus of need-based aid 
        to students to a focus on institutional aid instead?
    If institution-based metrics such as graduation rates or alumni 
earnings are used to assign federal ``rewards and punishments,'' will 
that mean a shift in federal aid to higher education away from 
individual students to the institutions they attend? Is that really the 
direction that we want to go in a country that traditionally has put 
high value on the individual? As the parent of young children, I know 
well that rewards and punishments induce--whether intended or not--
certain behaviors. Is Congress certain that it wants to send higher 
education the message that if you don't graduate all of your students, 
or if your graduates don't end up in high-paying jobs, then we will 
reduce the financial aid we can make available to students? If so, then 
the behavior you will induce will be a narrowing of the field of 
students that colleges see as ``admissible''. In essence, colleges will 
be unable to ``gamble'' on high-need but high-risk students because 
their potential failure could jeopardize the government aid available 
to all other students.
5. Will your use of data fundamentally alter the role of the federal 
        government in higher education--essentially federalizing what 
        has been a pluralistic, local, and entrepreneurial network?
    We have an internationally-respected system of education because it 
is diverse and dynamic; students from across the globe flock to study 
in the United States--even non-Ivy League institutions and those 
colleges tucked away in rural communities have international student 
populations in the 2 percent to 10 percent range. They come because 
America's higher education system is rich in quality and diversity. If 
you create a system of rigid and well-defined data points, that 
diversity will begin to disappear as many institutions will feel forced 
to assimilate their programs and admissions policies to score well on 
the common data set. I caution you against creating a set of data that 
unintentionally will become the governmental version of the U.S. News & 
World Report rankings! If institutions must adhere to a set of 
narrowly-defined priorities and measures the federal government 
establishes, they'll do that, but lost in that approach will be the 
diverse models and creativity that have defined American higher 
education since before the nation's founding.
    While I do not agree with many of these new directions, I encourage 
you to have a purposeful conversation about where Congress wants our 
educational system to go. Similarly, I encourage you to actively reach 
out not only to researchers, but also to practitioners on college 
campuses to get their feedback on what really matters. As Albert 
Einstein famously said, ``Just because something can be counted, 
doesn't mean it counts.'' I fear that many of the millions of us who 
work on college campuses are not actively engaged in, or even aware of, 
the profound policy conversation taking place in Washington.
    The challenge here is in recognizing that the chosen information 
will drive policy outcomes in ways both intended and unintended. There 
is the potential to find ourselves in the dilemma best outlined by the 
age-old fable, The Blind Men and the Elephant. In short, it is the 
story of six blind men who each feel a different part of an elephant. 
Each comes to a different conclusion as to what they have touched (a 
rope for the tail, a spear for the tusks, etc.). Looking at narrow 
indicators of institutional performance could have the same misleading 
effect--especially when we apply those indicators to the diverse array 
of institutions in the United States.
    Similarly, I fear some of the well-intentioned analysts advocating 
innovation in post-secondary education are unaware of the remarkable 
changes taking place on most college campuses. Technology is rapidly 
reinventing how, who, and where we teach. Colleges are offering new 
career programs and serving new student populations. And, more and more 
campus resources are being allocated to match the federal efforts in 
student aid, helping to make college possible for our increasingly 
needy and diverse student population.
    All of this innovation is happening in higher education because of 
the marketplace. The market has provided higher education with volumes 
of useful products and opportunities that drive our direction--and in 
turn, many institutions and faculty have contributed to the development 
of those innovations. We academics sometimes like to think that we 
are somehow outside of the market, or exempted from it. But in reality, 
the market is the most important driver of educational creativity and 
quality. Inside the academy, we know which faculty members are the most 
engaging professors, and students ``vote'' with their feet by 
registering, or not, for their classes. Similarly, if an institution is 
of poor quality, students and parents will figure that out--whether 
through social media or through the thousands of data points currently 
available to the public--and they will migrate to other educational 
options, eventually causing the weakest institutions to close. In 
addition to the market, the extensive process of peer-review 
accreditation in this country provides an important level of additional 
quality oversight.
    In order to be effective, markets need transparency in information. 
Today, I am presenting to you--literally--reams of paper documenting 
the information that is readily and openly available to the public on 
just one institution--Shenandoah University. Will more disclosure 
requirements or an over-arching data set really add more to what is 
already there? Or, will it simply add another layer, or narrow the 
information available to students and parents as they attempt to 
navigate the higher education sector?
    As the president of a not-for-profit institution of 4,000 students, 
I am proud of working in a field that I believe is essential not only 
to our nation's future, but central to who we are as Americans. The way 
we approach education at Shenandoah reflects not only our national 
traditions, but the history and challenges of our region, and most 
importantly, the unique needs of our students. Shenandoah would be 
different if we were in California or Maine. Shenandoah would be 
different if we were a public or community college, or a purely liberal 
arts institution, or a research university, or an Ivy League 
university. Yet, I am proud of our place as one shining tile in our 
national mosaic of higher education, and I am equally proud of my 
colleagues in higher education who serve different populations in 
different ways. Together we reflect a high quality and diverse system 
that is unlike that of any other nation.
    As a college president, I can also tell you that every decision you 
make here affects us profoundly on campus, in more ways than you can 
realize. If you tell me to improve my graduation rates in a certain 
way, or that you will judge Shenandoah by the earnings of our 
graduates, I am going to respond to that. But if this is done under a 
rigid national formula, bringing the broad swath of American 
postsecondary education under one rubric, I worry that you will 
unintentionally federalize a system that is strong because of its 
diverse and non-governmental foundation. And, ironically, not only will 
choice suffer, but quality will suffer as well.
    I recognize the difficulty here. You see a broad taxpayer 
investment in student aid. You need to ensure it is well spent and well 
used. I have the fun part. I see the human face of that investment. I 
see the low-income, first-generation-to-college student who makes it 
because of our student aid partnership with the federal government. I 
know it is working, but you don't have that on-the-ground view. So you 
need proof. But I worry that your proof could become codified in a way 
that makes it less likely that low-income student is given a chance.
    This, I believe, is the real policy conundrum for this 
reauthorization. We all have the same goal, but our needs for evidence 
are different from the top than from the ground. The resulting 
requirements can also come at considerably higher costs to those of us 
on the ground.
    In this regard, I have also been asked to address today two other 
aspects of the data question: burden and transparency. Specifically, if 
we could agree upon a narrow set of data points on which to establish 
institutional validity, could we then reduce some of the heavy 
regulatory burden and compliance costs for colleges that flow from 
federal, state and accreditor mandates? It is a question worth 
exploring, but one that I am not sure I can fully answer because it 
involves so many layers of independent decision makers, and so many 
entrenched rules.
    Let me give you the campus view of just the federal role in this 
issue. I hear a lot of criticism from Washington that colleges are not 
transparent enough. For example, I was asked to address whether 
colleges should provide more fiscal transparency. From where I sit as a 
college president, we are drowning in fiscal transparency--and at 
today's hearing I am leaving with you a stack of sheets representing 
just some of the data available right now to the general public about 
any public or non-profit private, two- or four-year institution in our 
country.
    In 2008, the IRS decided they, too, wanted more fiscal transparency 
and so revised Form 990, the mandated annual filing for all non-
profits. The new form, which took several years to revise, includes an 
11-page, 11-part core form, and 16 schedules. The many reporting 
changes affecting colleges include governance, compensation of 
officers, fundraising, public support, political activity, and related 
organizations. The changes imposed a major additional workload on 
every private, non-profit college, and added considerable auditing 
costs.
    I have with me today, Shenandoah's Form 990. I will leave it 
behind, so you can look at it and tell me what you don't know from 
reading this that you need to know * * * and recall that all of these 
financial data are available to anyone, since the 990s are public 
documents available on-line. Now that our auditors and CFOs are all 
trained on this, now that our compliance software is re-purchased and 
upgraded, are we to expect another new layer of fiscal transparency 
from the Department of Education?
    The year 2008, when the new Form 990 went into effect, was a big 
year for new college regulations because it was also the year that the 
last reauthorization of the Higher Education Act became law. Attached 
is an executive-level introduction, President's Quick Guide to the New 
Law, produced by NAICU for independent college presidents, to help us 
meet the massive campus-wide compliance requirements this legislation 
generated. The guidebook is just a bird's eye view; for example, it 
includes virtually none of the changes made to the student aid 
programs, because our financial aid offices generally know how to 
handle those matters. Simply skim the book to see the kind of changes 
affecting areas beyond student aid, including campus police, technology 
officers, CFO's, institutional researchers, and academic affairs--and 
this in legislation that called for two studies of deregulation--one of 
which has not even been started.
    So, this is the dilemma. Even when Congress tries to deregulate, we 
end up with new requirements, but no relief from current 
requirements. And, if those new requirements were to measure us by 
narrow standards, and make our system of higher education less diverse, 
we would lose more than we gain.
    I don't want to close without offering some thoughts on emerging 
ideas I see as having good potential. I think the federal government 
can play a positive role in consumer information. There is much 
conversation right now, at the national level, about how to ensure that 
students and families have some basic information on all colleges to 
help them find a ``best fit'' school. I love this idea because, like so 
many of my small-college colleagues, I believe my institution is a 
hidden gem. Shenandoah is just a little more than an hour away from 
Washington, and easily accessible, but I'm sure many of you had never heard 
of my university before today. We have capacity to serve more students 
who might find our university a good fit, but I don't have a big 
advertising budget. I love the idea that the federal government might 
help prospective students find us.
    However, let's look at how the current federal consumer 
transparency efforts are playing out. Just last week, a Senate 
committee began consideration of a bill that would require institutions 
to collect a whole new set of detailed data for veterans. I 
wholeheartedly support our providing information that helps veterans to 
make smart choices, but I fear that many of the current proposals will 
not have their intended effect. For example, we estimate that the new 
Senate requirements include almost 30 new items, several of which would 
have to be further broken down by program level. Nearly all of this 
differs from the information that is already being collected by the 
Department of Education.
    Also last week the House approved its own legislation dealing with 
information for students who are veterans. The approach in this measure 
avoids many of the excesses of the Senate proposal, but is not without 
its problems as well. For example, it would require disclosure of 
median Title IV debt levels for all students at an institution, while 
another federal initiative is already calling for disclosure of median 
debt levels ``for completers.'' Having two numbers for the same 
institution that will appear to most consumers to describe the same 
thing confuses more than it enlightens.
    And both of these bills come on top of several other current 
Administration efforts to provide more consumer information. 
Colleges have been asked to sign on to the ``Principles of Excellence'' 
included in an executive order (EO 13607) dealing with veteran- and 
military-related education programs. Among other requirements, 
institutions agreeing to the principles must provide all military-
related students with a new Financial Aid Shopping Sheet for the 2013-
14 award year. Although a final version of the Shopping Sheet has yet 
to be developed, the Department of Education is already pressing 
colleges to provide it to all incoming students. The Shopping Sheet is 
not to be confused with the College Scorecard or the Aid Offer 
Comparison Tool, also under development. And, all of these are in 
addition to the College Navigator, the Department of Education's 
website intended to help consumers choose a college that best fits 
their needs.
    My students have a term for this: TMI!
    Not only is it too much information, but nearly all of these 
proposals are based on the various factors policy analysts want 
students to use when making a decision on where to go to college--
rather than the ones they actually use.
    Let me tell you how the college selection process goes from my 
vantage point: high school students, often with their parents, decide 
how far away from home they are willing to go for college. They get 
advice--solicited or not--about which college would be best for them 
from relatives, high school teachers and counselors, from peers, from 
Facebook ``friends.'' They narrow their search in so many ways! Some 
already believe they know what they want to study, so they look for a 
school that has their intended major or career path; while others feel 
more comfortable with a broad, liberal arts environment. Some settle on 
a place like Liberty University because it is strongly Christian and 
others lean toward Swarthmore because it is deeply academic. University 
of California at Berkeley attracts those interested in research--or 
often those with a particular political preference. Northern Virginia 
Community College is the obvious pathway for many in the region who 
want or need to live at home and save money while also presenting them 
with lots of opportunities. At Shenandoah, we find that students 
initially look at us and apply to SU because of our location, and also 
because of the variety and quality of academic programs. But in the 
end, those who choose to attend Shenandoah often say they do so because 
of the warmth and personal touch and faculty support they experienced 
while visiting campus. In effect, our 9:1 student-to-faculty ratio 
meant nothing to them until they experienced it. There is no data set 
that you can establish that will capture the personalized-approach or 
research-intensiveness or student-life or religious-commitment that in 
the end compels students to attend and strive to graduate from a 
particular institution.
    My point here is not to disagree with the view that there are some 
important data points we might place in front of prospective students 
for an informed college selection. Rather, my point is that the 
selection process includes some very important factors that cannot be 
measured. And just as importantly, if we don't keep it simple, we have 
accomplished nothing but more costs for colleges and more confusion for 
the student.
    There are some better examples out there. In 2007, NAICU took all 
the ideas on consumer information swirling in Congress during 
reauthorization, and put them before focus groups of prospective 
parents and students. Attached is the simple, two-page profile on 
Shenandoah University that resulted from that process. It combines both 
elements of interest to policymakers and the information families told 
us they wanted. We are one of 827 NAICU schools that are signed up to 
participate in U-CAN. I'm citing this example not to promote U-CAN but 
to make two points: first, that less can be more, and second, that the 
needs and interests of real-world students must inform the development 
process or the end result can be the type of all-but-the-kitchen-sink 
approach we see emerging from the veterans committees.
    I'm afraid I may have raised more questions than provided answers 
today. If so, it is because of where we are in the process. The data 
question is really the tip of the iceberg of the more profound 
underlying questions I have suggested. I want you to know that college 
officials care deeply about our nation's educational future, and we are 
deeply appreciative of how much Congress has done to support our low-
income students' dreams to go to college. We welcome this conversation 
and are appreciative that you have asked us to be part of it.
    I understand how tight the federal budget is. I am even more aware 
of how tight the budgets are for so many of our students' families. The 
funding to make their dreams possible does not come easily, nor without 
painful tradeoffs, but it does make a profound difference in so many 
lives. We need to ensure that we are accountable to the taxpayers who 
provide our students with this critically important support. However, 
we also must ensure that any accountability measures are appropriate 
and helpful, and don't have unintended consequences.
    Let me thank you again for all you do for the students at 
Shenandoah University and for students across the country.
    Chairwoman Foxx. Thank you very much, Dr. Fitzsimmons.
    I want to thank all of you for the excellent material that 
you presented ahead of time for the record. I had a chance to 
read it. It is full of great information. And I want to make it 
clear that your written statements are on our Web site. They 
have been sent to all of our members. And I am going to 
encourage them very strongly to read them because it is excellent. 
    And I appreciate the comments that you have made today, and 
the fact that you did not read your testimonies, but spoke of 
the concerns that you had. So, I just want to make a point 
about that because your material is really excellent and I 
appreciate it.
    I wanted to ask all of you a question. It is a--the 
question is a little long. But I want to ask you to answer as 
quickly as you can. And then if you want to, again always 
submit additional information about it for the record, we will 
certainly get it in.
    Mr. Hinojosa. Will the chairwoman pause for just a moment? 
I have a question. Being that it is just you and me 
representing both sides of the aisle, would----
    Chairwoman Foxx. Go right ahead.
    Mr. Hinojosa. Would you--I see Chairman Kline just walked 
in. Excellent. You count for five, Mr. Chairman.
    Chairwoman Foxx. I was going to say, he is not chopped liver. 
    Mr. Hinojosa. I looked over and noticed that it was just 
you and me. And now that Chairman Kline has come that may 
change the formula. But I was hoping that you would consider 
having a period of at least 30 to 45 minutes of questions to 
the panelists because this is very, very important to all of 
us. And I was just hoping that we would not just have one 
opportunity to ask 5 minutes of questions.
    Chairwoman Foxx. If we do not have a lot of other members 
come I certainly intend to allow for more than one round of questions. 
    Mr. Hinojosa. Thank you.
    Chairwoman Foxx. If you show up, you get to do things.
    Now, back to my question; if you had to select five key 
data points the federal government should collect on all 
institutions of higher education, what would they be? Are these 
data points currently collected by the federal government, or 
by the states or by the accrediting agencies?
    Is anybody--I do not want to pick on Dr. Schneider all the 
time. Is anybody prepared? Are you prepared? Please start, and 
then we will go down.
    Mr. Schneider. Sure. I will be more than happy to start.
    I think we could identify probably more than five. But I 
really think it is more the extent of the coverage that is 
fundamental, right. And again, I think that we--clearly 
completion rates is fundamental. I think retention, progression 
and completion are the suite that it has to be. This is a 
compelling national interest, a compelling student interest. We 
just need to make sure that it covers more students than at the 
current time.
    I believe that the other fundamental piece of information 
that is needed, and I will limit it to four, is, two, student 
success in the labor market afterwards. And we can do that and 
we should do that. And I think, again, we need to have a 
broader coverage, if you will.
    I believe that we have some--not I believe, I know we have 
information, for example, from the Census Bureau or the BLS at 
the aggregate level, very high level. But students do not get a 
bachelor's degree. They get a bachelor's degree from a program 
in a specific university. Students do not get an associate's 
degree. They get an associate's degree from a program in a 
specific university.
    We need to know what the success rates of those students 
are at the program level. And I think we really--we can do 
that, and we need to do that.
    Chairwoman Foxx. Dr. Hallmark?
    Mr. Hallmark. I believe your question was what should be 
collected at the federal government level?
    Chairwoman Foxx. Yes.
    Mr. Hallmark. I think I will say--I am going to weasel out 
a little bit on this answer and indicate that I am not sure 
that this has to be done at the federal level. But I will say 
that we do need good information. And I would agree with the 
information that my colleagues here have mentioned.
    We need completion data, good completion data. And I think 
the word completion, rather than graduation or graduate, should be 
used because there are various ways by which one can complete their 
educational goals.
    I think we need to focus on at-risk populations, partly 
because that is where most of the federal money is being 
targeted, just by the nature of the process whereby one 
qualifies for financial aid. And so we ought to be looking at 
at-risk populations more carefully, seeing what their success 
rates are, what kinds of programs and opportunities are leading 
to their success and make sure that that is money well spent, 
and if not, redirecting that money in a manner in which it 
would be better spent to serve the goals of the state, nation, 
region and institution.
    Student success after graduation I do think is a valuable--
very valuable, very important metric that we all need to be 
paying attention to. And I believe that institutions are doing 
that much more so now than they were 5 years ago and 10 years ago. 
    I do believe that progress is being made in that way. It is 
not, perhaps, a slam dunk at this point in terms of 
institutions having the kinds of data as to what their 
graduates are doing. But I do believe the institutions have 
significantly increased what they are doing in that area.
    Chairwoman Foxx. Dr. Cruz, the pressure is on. Not much 
    Mr. Cruz. All right. So, I would say extend the graduation---- 
    Mr. Hinojosa. Push the button.
    Mr. Cruz. I would say to extend the graduation rates beyond 
first-time, full-time students to include low-income students, 
transfer students and part-time students. Also to report 
graduates by financial aid status: those that receive Pell Grants 
versus those that receive Stafford Loans and those that do not 
receive either of those.
    Collect net price data for non-Title IV recipients, add 
private student loans to the National Student Loan Data System, 
and use the new completion flag to generate new cumulative 
student loan debt by institution.
    Chairwoman Foxx. Thank you.
    Dr. Fitzsimmons?
    Ms. Fitzsimmons. It depends deeply on what the information 
is used for. If it is about accountability for Pell Grants, that 
is one data point that is important to the federal government. 
Otherwise, I am not sure that it is the federal government's 
job to collect that data on behalf of prospective students.
    If you wanted to do that, I would urge you to look at the 
Ucan Web site that already exists. That is www.ucan-network.org 
because this was created through deep focus groups with 
prospective students and parents and others interested in higher 
education.
    What we know is that they tell us that cost is important. 
Graduation rates are important, except that they are impacted deeply 
by the average--the median salary of the families that are attending, 
whether they are first-time college goers, et cetera.
    Please, I urge you to not focus on graduation rates. In 
fact, there are a number of members of this committee that did 
not graduate from college. Clearly they have been highly 
successful. Others here went to institutions that have 31 percent, 48 
percent, 51 percent graduation rates.
    I do not think you would want to put your alma maters out 
of business. In fact, they are doing a tremendous job. And they 
are taking risks on some who do not graduate, but at least they 
have been given the opportunity.
    So, I would urge you to think about whether it is the 
federal government's role. We already have a Web site for 
private colleges and universities across the country called 
Ucan that was built on focus groups, and gives the information 
to students, not just measurable data, but the other things 
they are interested in.
    There is a button. You can find out about the local 
community. Do you care about being in a big city versus a rural 
area? Does safety matter to you? Does what kind of spiritual 
life programs matter?
    Chairwoman Foxx. Thank you all very much.
    Mr. Hinojosa?
    Mr. Hinojosa. Thank you.
    Dr. Hallmark, thank you for joining us today. My question 
to you is, are there any unique qualities to the A&M project 
that distinguishes this data from similar dashboard efforts?
    Mr. Hallmark. Thank you. Appreciate that question. There 
are some. I want to carefully use the word unique because I do 
not know what might be going on in all places everywhere. But 
certainly there are some things that we are doing in the A&M 
System's metrics project that are unique.
    I think one of the things is our stretch goals where not 
only are institutions identifying where we are right now, but 
we are also identifying where we want to be in 2015 and 2020. 
And those are not necessarily increases or decreases, not 
necessarily predictable.
    For example, an institution may say we need to downsize a 
little bit because of the nature of what our expanding--our 
changing mission might be. For example, I provided my oral 
testimony of research--or excuse me, expenditures per full time 
student equivalent. And institutions such as A&M at Texarkana 
have expanded downward to include freshman and sophomore 
students, who require a tremendous amount of support that is 
not necessary for success at an upper-division and graduate 
institution.
    So, they are increasing their expenditures per full-time 
student equivalent. And so moves like that I think are very 
forward-thinking and thinking through what the specific mission 
is of the institution and how it can best serve its public.
    The only other thing I would mention, I would say there are 
several. But the only other thing I would want to mention at 
this time without further follow up is our excellence measures 
where each institution has said here is a core mission that our 
institution has.
    We are targeting let us say Hispanic populations, and we 
are specifically tracking that for that institution. And I know 
that is common data, but the point is the institution is saying 
this is important and we are going to track that and be held 
accountable for improving our performance in that area. And 
each institution----
    Mr. Hinojosa. That is interesting. And I am going to come 
back to you on another question in the next round.
    I would like to ask Dr. Fitzsimmons a question. If 
consumers are not informed and able to make comparisons across 
institutions, students' choices do not reflect a quality 
education. So, do not you agree that students can make more 
informed choices about universities when they have information 
on outcomes?
    Ms. Fitzsimmons. And in fact, Ranking Member Hinojosa, 
students do make those choices right now. There are a number of 
Web sites available where students can go and compare 
institutions. They can put in certain questions, you know I am 
looking for do they have this major, how much does it cost, is 
it a public or private, et cetera? So, they can do comparisons 
right now. It exists in a number of these datasets. Okay.
    I think the challenge is that so much of what students are 
interested in is difficult to measure in terms of quantitative 
data. I am a political scientist by training. We 
rely on quantitative and qualitative data. We ask our students 
to do that in classrooms as well across this country. We have 
to find a way for both.
    Mr. Hinojosa. I will come back to you with another question 
in the next round.
    Dr. Cruz, do you believe it is narrow-minded or 
unreasonable for the federal government to hold institutions 
accountable for the student's ability to repay loans?
    Mr. Cruz. I do not think that it is narrow-minded at all. I 
think that it is something that should be done. It is important 
for students to understand not only sort of the value or the 
qualitative value that they will get from an institution, but 
also the cost and the risks associated with not graduating from 
that institution because it might be that that institution is 
not doing as much as it could to support that particular 
student.
    I do not think there is a disconnect between requiring 
institutions to provide more information to students, including 
the graduation rates for low-income students, and the ability 
of the students to also evaluate the qualitative aspects of 
particular institutions.
    I have a senior who is in high school right now, and he 
gets more mail than anybody in the household; every day at 
least three or four very shiny mailings from different 
universities across the country. So, the fact that we would be 
requiring institutions to be more forthcoming about how they 
are serving their students and the costs associated with that 
service does not impede that there would continue to be these 
other mechanisms by which the students can get a broader 
picture of an institution.
    Mr. Hinojosa. I share with you the feeling that your son or 
daughter is receiving more mail. I have the same thing with my 
young girl, 18 years old, in my household. And it is the same 
way there.
    I have more questions, but I will yield and come back at 
the next round. Thank you.
    Chairwoman Foxx. Thank you, Mr. Hinojosa.
    Chairman Kline?
    Mr. Kline. Thank you, Madam Chair. I want to thank our 
panelists for being here. It is not every day we get a panel 
where every name starts with doctor, but very impressive.
    I was thinking about my own college-going decision many, 
many, many years ago. And at the time it would seem fairly 
simple for me. I happened to be living in Corpus Christi, Texas 
at the time. And so I was looking at different schools. And 
sorry to say, I did not look a whole lot at Texas A&M.
    But you know I looked at the University of Texas. And at 
the time it was relatively inexpensive for a Texas resident to 
go to University of Texas. And so I was making my decision to 
go because it was relatively inexpensive. And then I was 
offered--got a letter from Rice University, and they offered me 
a full scholarship. And so free was better than cheap. And I 
went to Rice.
    So, I do not--I think it is not quite that simple now, 
although it may be for some families. They are looking for free 
is better than cheap, and cheap is better than expensive.
    But around here we talk about return on investment a lot. 
And so I want to go to Dr. Fitzsimmons first. Just because we 
are talking about return on investment does not necessarily 
mean that families and students are.
    And my question is do you think they are? And if they are, 
what is the return on investment? What counts? Is it getting a 
job? Is it the pride of being part of winning football teams? 
What is the return on investment? And how are they--if they are 
including that in their calculations, what are they looking 
at?
    What--I am impressed by that huge stack of paper to your 
side there. I want to get at that also in a minute.
    What do you think? What do they--what are they saying on 
what return on investment means, if they are?
    Ms. Fitzsimmons. They are all talking about return on 
investment. The challenge is that how we each define return 
on investment is different. And so for a student who is 
entering into a music theater career, the return on investment 
might be that she does not graduate because after sophomore 
year she got an opportunity to perform in the national tour of 
Beauty and the Beast. That is a tremendous return on her 
investment if we prepared her for that.
    Or the student who decided to major in business and after 
junior year she had a phenomenal idea and she left to start her 
own company. I hope she will come back to college someday. But 
that might be a great return on investment that we prepared her 
to be an entrepreneur.
    For other families the idea of a great return on an 
investment is that their student will learn a lot and be well 
prepared for a career, while also being able to deepen his or 
her spiritual belief while in college. The return on investment 
for some is to have their child close to home because they need 
help in other ways on the weekends or on the evenings during 
the week. And for others it is that first kid that ever 
graduates from college in the entire family.
    The challenge is I do not think it is the federal 
government's job to define return on investment for the young 
people of our country; helping provide lots of opportunities to 
access data, certainly.
    And to what Dr. Cruz was speaking of, those kinds of data, 
I can tell you for Shenandoah University and so many other 
institutions across our country, right here in that big bottom 
part, that is the facts book. It is online. You can find 
answers to all those questions that you were asking for right 
on our Web site. So, if your son or daughter who is a senior is 
interested, I would urge them to think about that.
    Mr. Kline. Let me interrupt if I could. Thanks for that 
answer. But I am curious as to why you said you hoped she would 
go back to college in your example. What would be the return on 
investment there? She left after a sophomore or junior year, 
was very successful. Why did she go back to college?
    Ms. Fitzsimmons. I guess I am one of those old-fashioned 
people who believes that college is about both preparation for 
a career, which we would have done successfully in the example 
I gave you, and preparing people for a lifetime of great 
decisions with the breadth of their education.
    So, I think that a great course in political science helps 
us be better voters. I think it is important to know something 
about literature and science. And so I would urge her to come 
back to college so that she can broaden her thinking.
    Mr. Kline. I guess I am old-fashioned too. I see that my 
time is about to expire, so I do not want to get into another 
question here, except to say that part of what it seems to me 
we need to be doing is something about huge stacks of paper.
    We have had hearings here before where people from 
different college and universities came in with binders full of 
regulations, many of them federal regulations that they have to 
deal with that probably do not help a single student either get 
a degree or get a job. But it is a pretty big pile of stuff. 
And so we are always interested, many of us at least interested 
in how can we streamline things and make them simpler?
    Madam Chair, I see my time has expired.
    Chairwoman Foxx. Thank you very much, Mr. Chairman.
    The word got out that this was a very stimulating hearing 
and people are beginning to show up.
    So, Congresswoman Davis, you are recognized for 5 minutes.
    Mrs. Davis. Thank you, Madam Chairwoman.
    I appreciate you all being here. You know we are talking 
about how each individual student would really value the cost 
of their education, and how difficult that is. But I think that 
the other issue is whether or not the information that they are 
looking at necessarily reflects information that is required, 
which is problematic on the one hand.
    But on the other hand maybe it is just voluntary 
information so that there are some institutions or some other 
ways in which that information is getting out there, but does 
not necessarily reflect all the information that is out there 
so that it is a bit skewed. And students are having difficulty 
with that.
    So, getting to the question of what the federal role is, 
because surveys are not enough probably in getting that 
information. If students are evaluating whether or not this is 
really going to be the best thing for their buck. If they are 
comparing and they are trying to look at those issues and they 
are trying to think through whether or not they will have 
higher employment opportunities or they are going to be paid 
more versus another school.
    Is that important information to have? I know you 
suggested, and I would kind of go along with the fact that 
graduation rates per se may not be the best thing. Maybe it is 
not even how much they are going to be paid at the end, but 
whether or not they are actually employed.
    What is it that really gives a student the opportunity to 
try and fully evaluate over and above whether I want to go to a 
certain school, you know all the things that come into a 
decision that a student would make? I mean what is the bottom 
line for that, if a family is struggling with what they can 
afford and they want the biggest bang for their buck, what is 
it? And who gets that? How do you get that information to 
them?
    Mr. Hallmark. I would be happy to provide some comment on 
that. It is difficult because every student who enters the 
institution comes with a very different goal. If you have got 
10,000 students in your institution, you probably have 5,000 to 
6,000 or 7,000 different goals.
    Some, and I mentioned this in my oral testimony, 
particularly at the community college level, are seeking a 
certification or a set of hours that they need in order to get 
job x. And that is a very different goal than somebody who is 
seeking a bachelor's degree so they can get into law school or 
something of that nature; or accounting degree to pursue a 
particular--so, I think our goals are much more diverse than 
simply graduation or persistence or whatever the case might be. 
And that makes the data collection process considerably more 
difficult.
    I do not have an answer, but I do think an incredibly 
important part of this is that students come to us with 
varying goals. And if we could figure out a way to tap 
what those goals are and plug them into some kind of 
measurement system then we would have a tremendous asset there.
    Mr. Schneider. May I follow up? Yes, of course students 
have very complicated goals in attending college. And we could 
actually obfuscate and we could do all kinds of crazy things to 
say we are never going to get to a core set of metrics that we 
could emphasize. But the fact of the matter is that the return 
on the investment is an organizing principle that could cut 
through a lot of noise.
    So, the return on the investment is actually I am going to 
go to this school; what is my probability? So, it has to be 
individualized. What is my probability of graduating? How long 
is it likely to take me to graduate? So, that is my investment. 
How much am I paying, my net price? Okay.
    So, now I could actually, from the student perspective, I 
could actually figure out in fairly great detail what my 
personal investment in my education is. If we now extend this 
to (a) what my probability of having a job is and (b) what my 
likely outcome in terms of salary is, which we can do, then we 
could get an organizing principle around the ROI.
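    [Dr. Schneider's framing reduces to simple arithmetic. The 
sketch below is purely illustrative and was not presented at 
the hearing; the function and every number in it are 
hypothetical assumptions, not actual institutional data.]

```python
# Illustrative sketch of the ROI framing Dr. Schneider describes: the
# investment is net price times expected years to degree; the return is
# the expected earnings premium, discounted by the probability of
# graduating and of being employed. All figures are hypothetical.

def expected_roi(net_price_per_year, expected_years, p_graduate,
                 p_employed, grad_salary, baseline_salary,
                 horizon_years=10):
    """Expected earnings premium over a horizon, divided by expected cost."""
    expected_cost = net_price_per_year * expected_years
    annual_premium = p_graduate * p_employed * (grad_salary - baseline_salary)
    return (annual_premium * horizon_years) / expected_cost

# A hypothetical student evaluating one institution:
roi = expected_roi(net_price_per_year=12_000, expected_years=4.5,
                   p_graduate=0.6, p_employed=0.85,
                   grad_salary=45_000, baseline_salary=30_000,
                   horizon_years=10)
print(round(roi, 2))  # prints 1.42
```

    [Under these assumed inputs, every dollar of net price returns 
about $1.42 in expected earnings premium over ten years; the 
point of the sketch is only that net price, completion 
probability, time to degree, and likely wages combine into one 
comparable number.]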
    I agree that students have many more things that they care 
about. Some of them may be really core to the mission. Some of 
them may be just you know personally interesting, a football 
team; I like the orange of the Longhorns compared to whatever--
sorry, I do not know what TAMU's colors are. But anyway----
    Mr. Hallmark. Aggies.
    Mr. Schneider. Yes, Aggies, whatever. But I mean so those 
things matter. But from our perspective, I think from the 
perspective of data systems and the kind of issues that the 
Congress could deal with, I think that focusing on the ROI, 
computing it, making it easily available is fundamental to 
our----
    Mrs. Davis. Go ahead, Dr. Cruz.
    Mr. Cruz. I would echo what Dr. Schneider said. It is not 
about the extra data or qualitative issues around whether or 
not to go to college. There are just some fundamental questions 
that students should be able to get answers to. In the era of 
big data we should be able to provide every student an 
indication, a personalized, individualized indication of what 
the net price is, what the likelihood of this person graduating 
in X number of years will be, and what their ability to repay 
any student loans is. Those are fundamental questions that need 
to be answered, and are not right now.
    Mrs. Davis. I guess part of it is, how do you know that you 
are getting relevant information within that if it is not just 
by anecdote or survey? The requirement piece of it is what I 
think we are all struggling with.
    Mr. Cruz. So, there are various initiatives that have 
already been able to define the data and the metrics and the 
tools that can be used to have sort of a uniform view of these 
issues. So, the Complete College America initiative with NGA 
has been mentioned; the Access to Success Initiative as well. 
There are already institutions that are collecting this data 
and reporting this data. The one element that we are missing is 
making it go beyond the voluntary stage and having one place 
where people can see across the board comparisons.
    Mrs. Davis. Thank you.
    Mr. Platts [presiding]. I thank the gentlelady. I yield 
myself 5 minutes.
    I do not have I guess a specific question; I apologize. I 
was coming in from one hearing and as soon as the chairwoman 
comes back I have got to run to another one. But glad to have a 
chance to get your written testimony. Thank each of you for 
your testimony and also your work day in and day out on this 
issue. And I guess I look at the issue in two different ways.
    One is the individual return on investment, which is that 
job opportunity or that career, that entrepreneurial 
opportunity, and then the broader public return on investment. 
And really going back to our founding fathers, who in 
establishing some of our early institutions understood that as 
a new democracy a key to our success was having an educated 
citizenry if we were to be an effective democracy, and you know 
that return of investment is not a dollar amount.
But just citizens that are well-rounded, the liberal arts 
education approach, that they are going to be therefore more 
engaged in the process of governing. On the individual side 
there is certainly data I think is relevant.
    And the couple that I would highlight that as we look 
further into this, one that is very important to me, and what I 
am running is an Armed Services Committee hearing with the GI 
Bill and a record number of veterans now coming back into the 
education community that we are making sure that those true 
heroes of our nation have data to know that this institution, 
this degree is a good match for me, has a good record of 
success or assistance to veterans versus others that maybe are 
not as strong.
    So, that data that we can highlight that relate to the 
veterans community I think is key. You know the data that goes 
to an incoming freshman, a traditional freshman probably more 
so maybe than others, but not necessarily, but is looking at 
the typical student at an institution.
    And I think it was in the Chronicle of Higher Ed just in 
the last edition or two where they highlighted that maybe as 
much--according to their--I think their review that about 25 
percent of institutions provided different SAT scores to the 
Department of Education than to U.S. News & World Report for 
their ranking of their college or university.
    That raises a concern that hey we are providing data, but 
why is it different that is going to the department versus the 
ranking entity that so many families look at to evaluate the 
quality of that school? So, there is data being provided, but 
not necessarily consistent data.
    And then a final point, which was already mentioned, is 
how do you evaluate that return on investment from getting a 
$50,000 a year education and having extensive debt for a job 
that you know up front maybe is a $25,000 a year job because it 
is in social services? And making that informed decision. Is it 
still worth getting that top tier education? Or should we you 
know, maybe look at a less expensive still good education, but 
less expensive?
    I mean, there are so many different factors that go into 
the type of information, depending on who the user of that 
information is going to be and how they are going to use it. 
And you know, I took a quick look at the written testimony 
you provided. And I 
apologize I did not get to hear your oral testimony here today. 
But you are helping with that dialogue of how we hone in on 
this.
    In the end, the goal, whether it is an institution 
yourself, you know, providing information as open and 
transparent as possible to your prospective students or those 
who attend, or the federal government as a provider of a 
significant sum of the taxpayer funds, is that we all kind of 
move the ball down the field and hone this process as best we 
can to be as effective as possible--excuse me.
    I apologize. My first day back after being laid out for 3 
days, so still trying to get over this. Hopefully I am not 
contaminating anybody. I will not touch anything.
    I think I will wrap up because I am going to be able--I am 
not going to be able to continue to talk. And yield to my 
distinguished colleague from New Jersey, Mr. Andrews, for 
purpose of questions.
    Mr. Andrews. Well, thank you, Mr. Chair. I hope you feel 
better.
    I apologize for not being here for your oral testimony, but 
I read your written statements, and I thank you for your 
testimony.
    One of the problems that we have in higher education law 
right now is for purposes of tracking the value of our 
investment in Title IV money, we have two broad categories. We 
have gainful employment and everything else. So, if something 
fits the statutory definition of gainful, it is now subject 
to--sort of subject now to some convoluted new rules, most of 
which have been struck down by the courts, about debt-to-income 
ratios.
    It takes about 30 minutes to explain it. And I do not think 
it is a very good idea, frankly. And then we have everything 
else, you know, from a PhD in philosophy at Yale to a community 
college degree in accounting and sort of everything in between. 
Do you think that we should create some new gradations in those 
categories that would differentiate among the type of data we 
collect?
    The premise of my question is this. I do not think you 
measure the quality of Yale's PhD program by what the PhD 
doctoral graduates do. I do not know how you measure that.
    Speaking as a parent who has a daughter who might go to 
graduate school, I know you cannot measure success on these 
bases. But I think you certainly can measure an auto mechanic 
certificate training program as to whether someone gets a job 
as an auto mechanic.
    There is a lot of in between. Do you think we need some new 
categories in the law to help us differentiate both in the 
collection and interpretation of data?
    Mr. Schneider. So, I--the dreaded gainful employment word 
has finally come up. I think that to me was pretty much of a 
debacle. I think what happened in that was that there was a 
regulatory attempt that was far in excess of the ability of the 
data to support it. I believe fundamentally that--I understand 
the legislation and I understand why this was restricted to the 
for profits because of the terminology in HEA. But the fact of 
the matter is that I do believe that we need those data, the 
same kinds of data for every program in the nation. And I 
believe that we have----
    Mr. Andrews. I completely agree. Uniformity of treatment 
on----
    Mr. Schneider. Uniformity of treatment.
    But let me go back to I think what is inherent in your 
question. And that is whether or not this is an informational 
exercise or a regulatory exercise. I think what happened with 
GE was that we put regulation in front of getting the data 
right. And this to me was really a fundamental misstep. And 
again, I understand you know the Department of Education 
regulation and, you know, the legislative language. But the 
fact of the matter is I do not believe that we are quite ready 
for the regulations.
    Mr. Andrews. By the way, the courts echo your opinion. 
Their opinion in that case was that the department has the 
regulatory authority to do this, but lacked the data to justify 
the rule that they in fact put forward, which is kind of the 
premise of my question. What data should we be collecting 
across the board?
    And again, I want to say this; that I do not think you can 
measure a philosophy PhD program the way you do an auto 
mechanic training. You just cannot. And what I am interested in 
are your thoughts about what we ought to be measuring, for whom 
and for what purpose?
    Well, ultimately--one of my favorite movies is the movie 
Accepted. Ever see this? This guy does not get into any good 
schools, so he invents a college. He has got this geeky friend 
and they invent a college so his parents are duped into 
thinking he is at a real college.
    So, they have to put on this big front when the parents 
show up the first day and one guy's drunk uncle becomes the 
dean. This never happens in real life does it? And the parents 
walk in. And the father asks the dean what the purpose of the 
college is. And the guy starts sort of a long-winded 
explanation. The father rolls his eyes. You can tell he is 
ready to leave the campus.
    Finally, the dean says, you know what? The purpose of this 
college is to get a job. Well, the father loves this. Okay. The 
father thinks this is the reason for being for higher 
education.
    What answer should be given to that father? What data 
should we be giving him and his spouse and his son or daughter?
    Mr. Hallmark. I cannot speak to the gainful employment 
issue; I do not know the law well enough to comment. But I do 
think I want 
to go to your broader issue there, and that is how do we assess 
a PhD or a bachelor's degree in history or some of these very 
important historical liberal arts kinds of degrees that right 
now are not as fashionable because they do not lead to a 
specific job as clearly as, say, an auto mechanic kind of 
program.
    I think one of the things that we have to do is think 
longer term. When you look around, say, at a bachelor's degree 
in history, you are not looking at gainful employment 6 
months or a year or 2 years down the road----
    Mr. Andrews. Maybe never, huh?
    Mr. Hallmark. Maybe never. But that student may very well 
be a great poet, or may very well be--goes to law school when 
they are 35 and becomes an elected official. I mean, there are 
so many variables there that you cannot look just at the 
immediate outcome, but rather look longer term. And also going 
to the issue of the engaged citizenry, something I believe in, 
that we have a full range of outcomes that are not dollar-
wise----
    Mr. Andrews. If I may, my time is about to close. I want to 
thank the chairwoman for having this hearing because it is a 
really dry, sterile topic, but it really needs to be looked at. 
Because we are investing billions and billions of taxpayer 
dollars every year, and we have very little idea what we are 
getting for it.
    Now, I think intuitively it is a pretty good thing. And I 
think one of the reasons this country is the strongest country 
in the world, is it has got the best higher education system in 
the world. But I would like to be able to hone in on that and 
really understand a little more surgically. And I am convinced, 
as I think each of the witnesses have said, that the present 
data sets are in some cases over-inclusive, in other cases 
under-inclusive.
    They do not really let us make the kind of diagnosis that 
we need to make. And I think it is a very serious and very 
relevant topic that we can work on together to find data 
collection that is not overly burdensome on the institutions, 
but very valuable for the students and the taxpayers. Thank 
you.
    Chairwoman Foxx. I want to thank Mr. Andrews. And early--
late last night, early this morning I was reading an article 
called ``Who Killed the Liberal Arts?'' by Joseph Epstein. I 
think it is in the Weekly Standard, recent Weekly Standard. And 
I want to commend it to you. It is--I was laughing out loud----
    Mr. Andrews. I rarely read the Weekly Standard----
    Chairwoman Foxx. Well----
    Mr. Andrews. I will make an exception in this case.
    Chairwoman Foxx. It has some really good articles in it. 
And this is a good one. But, I have to say, I do not know how 
many people are going to appreciate all the humor that is in 
it. There is a good bit of humor in the article. But I 
recommend it to you.
    I believe now that we will do what Mr. Hinojosa wanted to 
do, which is to have a second round of questioning. And I would 
like to ask Dr. Schneider, given the constraints of current 
federal law, what options are available for the federal 
government and states to collect and provide useful post-
secondary data without infringing upon the privacy rights of 
students and their families?
    Mr. Schneider. The balance between what the states can and 
should do and what the federal government can and should do is 
one of course that is constantly evolving. And I think it is a 
very important discussion with no fixed answer, right. We have 
been at this for hundreds of years, and we will be at it 
hopefully for hundreds of years more, trying to figure out 
the appropriate balance.
    In terms of the federal data collection, at the current 
point in time, I think that the reauthorization of HEA is an 
ideal opportunity for the Congress to start 
cleaning out the IPEDS attic. There is just stuff in there that 
you know may have been important at one time or seemed 
important at one time. And I think that we really need a 
systematic effort to do this. And I think we need to ask the 
question, what is the compelling national interest in 
collecting data?
    And you know I--so you can tell that I did not even know 
what color the Aggies wore. So, I am not really much on sports. 
But you know we collect a lot of data on athletes and 
athletics. Why? You know what is the compelling national 
interest in that? You know our HR collection, again, needs to 
be rethought.
    I think that--but again, I am going to lay some of this 
blame on the history of legislation. And one of the things that 
I think is fundamental is an inventory of what is the 
legislative requirement and what are just sort of things people 
made up along the way?
    And if we want to clean out some of the attic then I think 
that legislation in the HEA, I think we may have the 
opportunity to get rid of some of the stuff. And I believe it 
is--you know given how long it sometimes takes to reauthorize 
HEA, maybe once in a lifetime. But we need to--we need to--I 
think we need to systematically work on that.
    I also think, and this goes to the issue of burden. We now 
require a school with 100 students to report all the same data 
elements as a school of 60,000 or 100,000 students. And some 
schools, given the fixation on first-time, full-time students, 
there are many schools that have a handful of those. So, why 
are we forcing schools to report all the data on students that 
do not exist? And I think again what we really need to do is to 
be more mindful about the diversity in the mission and size 
of campuses.
    And finally, as the ex-commissioner of NCES, I want--I 
really believe that that is a gem. They actually--you should 
engage them in a discussion about what they see on the ground 
about what is useful, what causes the campuses to pull their 
hair out you know? So, I think that that is a discussion that I 
think needs to be undertaken.
    I think the states obviously have a lot more skin in the 
game than the federal government in terms of supporting 
especially public institutions. And I think that we have to 
be--you know when I was at NCES you know we dealt with states 
all the time. And I just sort of said, can you not understand? 
Do this, do this, do this. And of course that is an unhelpful 
attitude.
    The states actually have incredible--you know incredible 
amounts of money in this. They have incredible regulatory 
authority. And actually they are wonderful partners. It is just 
longer and harder. But we have to be respectful of states. And 
states actually have control of so much more data than we have.
    For example, wage data, unemployment insurance data. They 
could link it at the current time. I think we have to explore 
those and then figure out what the limits of that are and build 
on that because for example the work that I am doing in 
Tennessee and Virginia using the state linked Student Unit 
Record and the unemployment insurance data we can match half of 
the students.
    So, right now--this is not theoretical. We can do this right 
now. We are doing this, and reporting the wages from every 
program in a bunch of states. We can do that right now.
    But there are these problems of coverage. And it may be 
ultimately that the nation decides that we really need a much 
more national approach to that kind of linkage than just 
relying on the states. But the states can and should do that 
right now.
    Chairwoman Foxx. Thank you very much.
    Mr. Hinojosa?
    Mr. Hinojosa. I am going to ask a short question, Dr. 
Hallmark. Once you have the data that we are asking that you 
all collect, then what? And how do you transform that data into 
meaningful information for parents and prospective students?
    Mr. Hallmark. I was struck by the huge stack down here and 
I think that is part of the problem is that when you have a 
fact book that thick, which I think we probably all do, it is 
so overwhelming when making any kind of decision. You do not 
really know where to go, how to approach that.
    And so the institution, state, whatever it might be, 
has to transform that data into something that is more 
meaningful to the decision maker. And I would say that is not 
just limited to a parent or a student, but that is true of a 
president or a provost or a dean as well. The data that 
already exists is overwhelming, and so it must be transformed.
    What we have done in the A&M System is we have gone through 
and decided what metrics we believe are most important. And 
importantly, we did that both at a grassroots level and from 
the top. We had both going on, and fortunately it worked out 
quite well, so far anyway.
    But we have had the institutions ask folks, through 
committees of faculty and staff, to say what is it that is so 
important that we want to be tracking and measuring so that we 
can demonstrate that we are furthering our mission along and 
serving these students better? And we have done that at the 
institution level, at the system level so that we are focusing 
on those specific metrics instead of the huge stack. And I 
think identifying what matters most is a very useful way to 
go about it.
    Mr. Hinojosa. Thank you for answering my question. Do not 
be overwhelmed by that stack because the first panelist, Dr. 
Schneider, said in his statement that the federal government 
spends 
billions and billions of dollars in higher education. And he 
referred to the 1950s and all the way through the 1990s.
    But I want the record to show that unfortunately minority 
institutions of higher learning were getting the lowest 
percentage of those billions and billions of dollars. And I 
will give you something very specific.
    In 1992 we created under the Higher Education Act the 
Hispanic Serving Institution designation. And the amount of 
money given to the HSIs was zero. In 1994 the appropriators 
gave them zero. I came here in 1996 and they were getting $11 
million for 36 HSIs. And in 1997 they got $11 million for 37 
HSIs.
    So, as you can see, it was very, very little. So, now that 
we have a stack that size, I can tell you that we are doing a 
little bit better for all minority institutions of higher 
learning, including HBCUs, HSIs, and Asian-American colleges. 
So, we need some data collected now that there is much more 
funding.
    Okay. And Pew said that, as of 2010 and 2011, Hispanics 
make up the largest minority group that is now attending 2-year 
and 4-year universities. So, we need more data like that so 
that we can see just how women and minorities are being served 
in higher education.
    We have a long ways to go to close that gap. But I am so 
glad to see a woman and a Hispanic were chosen to be our 
panelists. So, thank you for that, Chairwoman Foxx.
    It is wonderful to hear all of you speak. But especially to 
have the woman's point of view and the Latino's point of view 
because that is the group that is really, really growing: about 
55 percent of college graduates are women today, and about 55 
percent or 56 percent of voters are women.
    So, we need to really pay attention to why this data is 
being asked for by this panel--I mean, by this group of 
congressmen who have been attending the congressional hearing. 
It is a pleasure. And I will ask the last question of you, Dr. 
Fitzsimmons.
    I recognize that 88 percent of the students in your 
college are 24 years of age or younger. However, single mothers 
going back to school part-time to increase their skills do not 
have a counselor like my daughter had. These students rely on 
user-friendly data that allow prospective students to compare 
institutions. How would these tools affect the choice of 
college they select?
    Ms. Fitzsimmons. Of course, the challenge, Congressman, for 
those single women raising children is that they frequently are 
not mobile. So, they have very limited choices. And they tend 
to focus on their local community college or the 4-year 
institution that is closest to them.
    Mr. Hinojosa. That is true.
    Ms. Fitzsimmons. So, the real question is what kinds of 
programs do we have available to be supportive to them? And 
frankly there are men who are in their position right now also.
    We have got a veteran who is about to graduate from 
Shenandoah. He is a male. He is obviously an older student and 
he is recently divorced and he has primary custody for two of 
his young children. He is working and he is taking 19 hours, 
credit hours. And you know what? He is going to graduate with 
all A's and B's on his transcript it looks like.
    But the question to all the colleges and universities 
should be not only what kinds of data do we put out to help 
them make decisions, but how can we help them through? What 
kinds--do we have childcare available? We have a childcare 
center at Shenandoah University available to faculty, staff and 
students, subsidized for them.
    Many institutions, including my institution have special 
tutoring programs available to help. The chairwoman stated 
in her comments, and you echoed this as well, that the 
traditional student is not the typical student.
    Our typical student in American higher education is older, 
is interested in some type of online learning, has some high-
risk challenge. That might be that they have learning 
differences. It might be a student who has Asperger's who's 
going to school and needs some special support. It might be 
that they are a first generation college student and nobody is 
cheering them on. And so how can we all create that family 
environment for them?
    The typical student looks very different now than it did 50 
years ago. You should be encouraging us and finding us and 
finding out what we already do because it is amazing the 
programs offered to support students who are typical, but not 
traditional.
    Mr. Hinojosa. I yield back.
    Chairwoman Foxx. Okay. We are going to try one more round.
    Let us go back again to looking a little bit at the state 
and what the states are doing now. And if you all might respond 
to this; what factors are there that make up the high-quality 
state--state longitudinal data systems? And what factors are 
missing from the low-quality state data systems?
    I think, Dr. Schneider, you said that some states are doing 
very well; others are doing it, but are not publishing it. Do 
we know why there are those problems with the states? Are there 
people in the state who are--are certain people resisting 
putting out that information or developing the systems? Is 
there any kind of pattern that you have seen?
    Mr. Schneider. Yes and yes and yes. I do not know how many 
components there were in your question, but the answer is yes to 
almost all of them.
    I think one of the mistakes that we made, and remember, we 
are $700 million into this process, was that we did not have a 
use requirement. And there is a long history on this. There was 
no use requirement on this data.
    So, what has happened is that we have made this huge 
investment in these data warehouses, which I think of--I 
sometimes call them data mausoleums and going back to many 
years ago I think of them--you remember there was something 
called the roach motel. You know roaches checked in, but they 
never checked out. So, sometimes I think of these data systems 
as the equivalent. You know, data checks in and we never see it 
again.
    So, we spend a lot of money on building these, and we have 
very, very few concrete products that actually can and should 
help us inform consumer choice. And I must say, you know, some 
states are stepping up. And I spent a lot of time trying to get 
states to open up these data warehouses to make them available.
    I am not the only person that is doing it. But you know I 
am one of the people out in the forefront. And Texas has 
stepped up and Virginia has stepped up. These are partners of--
with us. It is a long, complicated process often because of the 
politics of data. It is not that the data do not exist. It is 
that the data do exist. And sometimes the results are not--do 
not make people all that happy.
    So, in the Tennessee data, which we released earlier this 
week on Tuesday, there are--there are graduates from programs 
in the state of Tennessee where the average earnings of 
bachelor's degree holders are $25,000, $22,000. And I think there are 
others in the exact same fields that are $35,000 and $40,000. 
So, some people have actually asked several times about, how do 
we compare across schools?
    So, in my written testimony I talk about risk adjusted 
metrics. And I think one of the things that we need to do is 
mount a serious effort to get risk adjusted metrics right. And 
we can do it. Hospitals do it all the time, right? And I think 
that institutions of higher learning need to do this also.
    But, I think that they are--at the end of the day there are 
programs that are going to do way better than other programs. 
And we could adjust for the characteristics of the students or 
we could adjust for the regional labor market. All fair points 
and we have to--and ultimately we have to do this. But 
ultimately there are programs that are just not up to the 
task.
    Now, this is where the state--this is where we end up with 
the state versus the federal government in information versus 
regulation, right? So, this is at the junction of all these 
fundamental, these fundamental issues. Again, I believe that 
the federal government missed an opportunity by putting--by not 
putting use requirements into these data systems. And I care 
most about the higher ed. But it is the same thing with the K-
12 system.
    So, we missed that and we missed an incredible return on a 
lot of money invested. And I believe that in the regulation 
versus information side, we are not at the regulatory stage, 
right. And I believe that the GE process told us that we are 
not ready for the regulation. But I also believe that we need 
to figure out how to get this information in the hands of 
consumers in a usable way.
    Now, that may be the state. That may be a private entity. 
There are all kinds of private entity companies out there that 
do big data, trying to push information out to students, guide 
them in the right way. And right now that may be as good as we 
can do, right. But I do not think we are at this junction 
between federal, state and information versus regulation. I do 
not think we are at a point where we can say this is federal, 
this is regulatory.
    Chairwoman Foxx. Thank you very much.
    Mr. Hinojosa?
    Mr. Hinojosa. Thank you.
    I want to ask a question of Dr. Cruz, following up on one 
of the last comments that Dr. Fitzsimmons made about that young 
man who is a single parent, with two children and carrying 18, 
19 credit hours.
    We have a problem in the state of Texas in that it is 
taking our students 6 or 7 years to graduate with a bachelor's 
degree. And there is talk about how we can improve that. What 
kind of incentives would you give us, Dr. Cruz, for colleges 
and members of Congress to promote so that we can graduate them 
in four, but not more than 5 years?
    Mr. Cruz. The actual incentives I think need to be informed 
by the data. And it brings us back to sort of some of the 
issues that we have been talking about today. So, we know that 
certain colleges and universities do better for their students 
than what similar institutions do for theirs.
    So, it is important that as we define what the new data 
requirements would be, or the new additions that we would have 
to fill some of the gaps in the current databases, that we work 
hard to ensure that the data that is collected will then allow 
policymakers and students and their families to be able to see 
which institutions are doing a better job to educate students 
that look like them.
    So, I am a Latino. I just graduated from high school and I 
have a certain socioeconomic background. And I am contemplating 
going to a particular school. I want to know if that school has 
the supports in place that will allow me to complete and will 
allow me to complete in a timely fashion, in 4 years if that is 
my goal. And it will allow me to complete in such a way that it 
will eventually, the value of that degree will allow me to get 
a job and to be able to pay back my debts.
    So, we need to be able to have the data to be able to then 
devise the incentives. And going back to the gainful employment 
regulation discussion previously, that is something that the 
federal government tried to do for the for-profit sector.
    The numbers of the sector as a whole indicated that 
something needed to be done. It is a sector that enrolls 
currently 13 percent of all the post-secondary students in the 
U.S., yet takes in 24 percent of the federal financial aid 
dollars and produces 43 percent of the student loan defaults. 
So, it made sense from that view of the data to try to figure 
out a way to identify those schools and programs that were not 
doing a good job.
    Mr. Hinojosa. That data certainly would help us. But let me 
just say this from experience, that some of our Latino 
students, men and women, who are given an internship while they 
are going to college and work 10 hours, 15 hours, no more than 
20 hours oftentimes can make it and they balance their time to 
be able to do it.
    How could we have at least a third of the jobs that are 
available by the college or the university at the library, 
working and serving at the mess hall or wherever, and giving 
them that opportunity so that they could accomplish the goal 
that I am asking for, graduate in 4, 5 years? Can that be done?
    Mr. Cruz. That would definitely help them. I mean, I think 
that what we have seen is that the institutions that are 
intentional about graduating their students and figure out what 
are the types of supports they need to put in place. And one of 
them is really to help their students----
    Mr. Hinojosa. So, it is possible?
    Mr. Cruz [continuing]. More time in their schools.
    Mr. Hinojosa. Thank you.
    I wanted to ask Dr. Hallmark, Texas A&M was all men back in 
the 1950s. And now it is both men and women. And I want to talk 
about women in STEM majors and STEM careers. Because women now 
make up well over 50 percent of graduates, what is 
A&M doing to recruit and help graduate women in STEM majors so 
that they can have those careers?
    Mr. Hallmark. Diversity at our flagship campus in College 
Station is not what it needs to be, and is a significant focus 
of the institution. In fact, a vice chancellor of the A&M 
System is now devoted to diversity and recruitment efforts. 
That is focused primarily on College Station.
    I am not personally familiar with all of those programs, 
but I would be happy to get back with you on that. I do not 
know the details of them well enough----
    Mr. Hinojosa. If it could be done at College Station, could 
not it be done in the other 11 satellite campuses?
    Mr. Hallmark. Diversity is not as big a challenge in those. 
We have good Hispanic female numbers, for example. In fact, at 
one of our campuses, the National Science Foundation right now 
is very interested in our success with female Hispanics in 
engineering. So, we are looking at how that is working well in 
our regional campuses, and we are also seeking how we can 
improve that at the flagship campuses.
    Mr. Hinojosa. So, there is hope?
    Mr. Hallmark. Yes.
    Mr. Hinojosa. Thank you.
    Thank you.
    Chairwoman Foxx. Thank you, Mr. Hinojosa.
    I want to thank our distinguished panel of witnesses for 
taking time to testify before the subcommittee today. As I said 
to you earlier, I think we are beginning a long journey on 
this, and others of you have alluded to it in preparation for 
the reauthorization of the Higher Education Act. But you help 
us make the first steps, and I appreciate that.
    Mr. Hinojosa, do you have some closing remarks?
    Mr. Hinojosa. Yes. Yes, I do.
    In closing I also want to thank all our panel of experts 
for sharing your views on this issue. As we look to improve 
data on post-secondary education and reauthorize the Higher 
Education Act in the next, or the 113th Congress, it is 
important to closely examine what types of data are most 
useful, relevant and user friendly to consumers and 
institutions of higher learning.
    And with that I yield back.
    Chairwoman Foxx. Thank you, Mr. Hinojosa. I do not know how 
much the panel knows about my background, but I worked at 
Appalachian State University for many years, handled--was an 
Upward Bound special services director. I did academic advising 
and orientation for new--for freshmen as well as transfer 
students. I became a president of a community college.
    I have been frustrated by this issue all my life. And am--
thought years ago that we were going to do better at solving 
the issue of data collection and understanding what our needs 
were. The language of the--of your presentations and the 
material I have been reading brings back a lot of memories to 
me of concerns that I have.
    I appreciate very much the emphasis that you have put on 
the issue of completion. Again, having seen students leave 
colleges without a degree, but having fulfilled needs they had 
and going on to be successful is an experience I have had. So, 
I agree with you on that.
    I agree with you talking about the need to deal with at-
risk students. I am particularly concerned with the point I 
think Dr. Schneider brought up that we spend more than twice as 
much money as any other OECD country. I read that in your 
material and I made a note to say something about it. I want to 
give you one little example of my own experience at the 
community college where I worked.
    I went in one day--I do not remember exactly how long I had 
been there, but I went into the registrar one day and I have--
we served three counties, primarily three counties and we had 
three high schools. And I went into the registrar and I said I 
would like to look at completion data by high school. And she 
said to me, we do not record which high school the students 
come from when they enroll here. I almost fell on the floor.
    I thought we collect thousands of pieces of information. 
And to me the most basic piece of information you would have 
collected would have been which high school are these students 
graduating from? And we did not do that. It blew my mind. It 
just--you know I am not a statistician although I love dealing 
with data. And that just shocked me.
    And I think, again, that the American public is probably 
very--would be very surprised to hear so much of what you have 
talked about today. We have so much data, and we seem to know 
so little. What a tragedy for all the money that we are 
spending in this country. Yes, we have I think the greatest 
higher education system in the world. And I want to see it stay 
that way.
    And I also want the consumers to get the best information 
that they can get so they can be making good decisions. So, it 
occurred to me as you all were talking, especially Dr. 
Schneider, maybe we need a consumer union like the consumer 
union we have on products out there for people. Maybe somebody 
will start a consumer union and they will test all these 
products, meaning the universities and colleges. And then 
publish real reports that tell people what is going on.
    But anyway, I want to say thank you all very much for being 
here today and providing such enlightening testimony, both 
written and verbally.
    There being no further business, the committee stands 
adjourned.
    [Questions submitted for the record and their responses 
follow:]


    Dr. Fitzsimmons' Response to Questions Submitted for the Record

    1. If you could eliminate any data collection or reporting 
requirements currently collected by the federal government, which ones 
would you select and why?

    If I could eliminate one regulation, it would be the federal 
definition of credit hour. Let me say that I am deeply appreciative of 
the work of this committee in moving repeal legislation through the 
House. This federal definition is causing havoc for accreditors, is 
unnecessary, and not an appropriate federal role. The whole debate 
makes one feel like the federal government doesn't know how to enforce 
the rules it has, or to detect fraud and abuse, so is flailing around 
into academic issues instead.

    2. Are there data elements your institution, state or accrediting 
agency is collecting or reporting that are different from what the 
federal government currently collects or includes on College Navigator? 
Do you have suggestions about ways in which the federal government 
could streamline its data collection and reporting requirements with 
what you are already required to provide to your state or accrediting 
agency?

    It is difficult to answer this question precisely, as the 
individuals on campus who are responsible for collecting data often do 
not know the source of the requirement.
    There are, of course, some differences. For example, one regional 
accrediting agency asks for Fall full-time enrollment (FTE), while 
IPEDS asks for an annual unduplicated FTE. The accreditor also requests 
more detailed information about faculty than is requested by the 
federal government.
    In general, slight differences in reporting requirements among 
different entities are not the main concern I've heard--particularly 
when the reason for the difference seems clear (i.e. an accreditor does 
have reason to take a closer look at faculty). Rather, the concern is 
the sheer volume of requirements--and the fact they grow with every new 
law and every new regulation and every new departmental interpretation.
    Even if overlapping requirements were the biggest problem, attempts 
to address it could well lead to a cure that is worse than the disease. 
I say this because it seems that the only way to achieve consistency 
among the federal government, states, and accreditation agencies would 
be to develop uniform requirements. Of necessity, this would have to be 
done at the national level. I'm not convinced that would be wise--as 
the likely outcome would either be collecting too little information 
that a state or accreditor may need for a specific purpose or 
collecting too much, simply because the federal mechanism for doing so 
is in place. To a great extent, I believe this has happened with 
College Navigator.
    Also, taking on the overlaps among the federal government, 50 
separate states and the multitude of regional, national, and 
specialized accreditation agencies would quickly become overwhelming. 
It would make more sense to begin by looking at the overlaps within 
federal requirements or at the overlaps within the Department of 
Education itself. A good start would be funding the National Research 
Council study authorized in the Higher Education Opportunity Act--as 
has been done in the Senate version of the appropriations bill for the 
Department of Education. In addition, the idea presented by Mark 
Schneider at the hearing regarding ``cleaning the attic'' with respect 
to IPEDS data is certainly worth exploring.

    3. Are the federal government, states, and institutions currently 
providing the information students and their parents really want and 
need to make the right postsecondary choices? If not, what information 
should they be providing? Do you believe that students and families can 
be provided with too much information on their postsecondary options? 
In other words, is too much data problematic or confusing?

    As illustrated by the pile of reports regarding Shenandoah 
University I brought to the hearing, there's plenty of information out 
there. Different people want to know different things, and governments 
and institutions try to be responsive. However, once the government 
``starts a list,'' so to speak, it gets hard to resist the temptation 
to add to it. College Navigator, which can be a great resource, can 
also be overwhelming. As I said in my testimony, it's TMI!
    Another example is the new reporting requirements coming out of the 
Department of Veterans Affairs, possibly soon to be augmented by 
legislation in both the House and Senate. I wholeheartedly support our 
providing information that helps veterans to make smart choices, but 
presenting them with mounds of minute detail is not going to help. If 
we could at least use what we have before we require even more, we 
would have made a huge step in the right direction.
    Now, on top of that, we're getting three new administrative 
initiatives: the College Scorecard, the Financial Aid Shopping Sheet, 
and the Aid Offer Comparison Tool. Colleges are being asked to adopt 
them basically ``sight-unseen.''

    4. Are there more appropriate or accessible ways the federal 
government can present outcome data to students and parents?

    Yes, and I think this would be a great federal role. We have so 
many great colleges in this country, and so many different types, 
families can be overwhelmed with the choices. It would be great if the 
federal government could help colleges and students find each other.
    This question was posed in the last reauthorization of the Higher 
Education Act. In response, NAICU worked with the staff on this 
committee and the Senate education committee to develop what became U-
CAN. We gathered the ideas coming out of the Administration and 
Congress, consulted broadly with our members, and ran focus groups of 
parents and students.
    A big challenge, per your earlier question, was keeping the amount 
of data presented to a reasonable level. We addressed that by setting a 
firm rule that the final product could not exceed 2 pages; and we 
dropped ideas if parents and students weren't interested in the 
information. Other items, such as the community information on page 2, 
were added because that was something they wanted that hadn't appeared 
on any of the policy lists.

    5. I agree with a lot of what you said during the hearing in terms 
of making sure institutional accountability and academic freedom remain 
in place. However, I'm also mindful of the vast amount of taxpayer 
dollars the federal government devotes to higher education. Many 
policymakers question what we are getting for that investment. How 
would you answer that question, particularly during a time when we have 
to make some tough choices about the future of student aid programs?

    First and foremost, I want to say how deeply appreciative colleges 
are for all Congress has done to stick by low-income students during the 
recent economic downturn. As I said in my prepared testimony, I KNOW it 
is working because I know the students it has helped. I also know the 
federal budget is in a very, very difficult hole and every dollar has 
to be spent wisely.
    I will skip the soft answer here--the one about human dignity and 
dreams and democracy--although I think it is most important. When you 
have to justify federal spending, you need some hard facts. So, here is 
what you are getting for our maximum Pell Grant investment of a little 
more than $22,000 for the poorest student to get through college in 
four years:
     Demographically, that student is likely to be the first in 
his/her family to go to college. If either parent had gone to college, 
it is likely he or she wouldn't be poor enough to qualify for the 
maximum Pell Grant.
     That student is exponentially less likely to ever need 
federal assistance in all its forms, if he or she finishes college, and 
the change is generational. In other words, the graduate's children are 
less likely to ever need government assistance, including Pell Grants.
     On average, the student who completes college will earn 
more and pay more taxes--past census data long ago put the figure at a 
million dollar differential in lifetime earnings, with varying private 
analyses claiming higher or lower differentials. A million dollars more 
in earnings on a $22,000 federal investment at even a 15 percent tax 
rate provides an awesome return.
     It is estimated that increases in national educational 
attainment have accounted for almost 30 percent of the growth in 
national income in the 20th century.
     Your federal investment leverages many other resources: 
state aid, institutional aid, and private scholarship money. You are 
only the first domino in our national effort to let those who work hard 
earn their way out of poverty.

     Dr. Hallmark's Response to Questions Submitted for the Record

    1. Are there data reporting requirements I would delete?

    In response to this prompt, I surveyed the Institutional Research 
officers within the A&M System and none of these officers reported any 
difficulty in easily and promptly reporting the necessary data 
(referring to NCES/IPEDS data). Current Federal reporting is not 
burdensome.
    Furthermore, current data reported through NCES/IPEDS is useful for 
making decisions. As is noted in response to subsequent questions, we 
do believe some improvements are warranted.

    2. Differences in state, system, federal reporting, and 
recommendations for improvement, including additional data to be 
collected.

    I solicited responses from A&M System Institutional Research 
officers. A few observations from these directors (arranged and edited 
for clarity):
     IPEDS focuses on students' headcount rather than on 
students' course-taking activities, such as semester credit hour 
production (SCH).
     SCH data helps predict/understand graduation/persistence 
rates and other performance measures.
     SCH data helps institutions prepare for course loads/
resource allocation.
     Headcount is not particularly useful in charting either 
student performance or resource allocation.
     Measures of student success are limited in IPEDS to 
persistence and graduation. The A&M System Analytics project delves 
into more measures such as engagement and learning outcomes.
     The IPEDS 12-month period doesn't exactly fit the normal 
academic year defined by a typical public semester university. It also 
does not use the same period as in our state reports. This requires 
effort to consolidate enrollment of semesters and sessions.
     Data definitions between the Texas Higher Education 
Coordinating Board and IPEDS are not the same. They are similar, but 
not identical, requiring adjustment as they report the same data to the 
state and to the Federal Government (race/ethnicity is a good example).
     If data definitions are aligned, it would be useful to 
have the option to enter at the State level and let it feed up to the 
Federal level as a batch State submission or conversely from the 
Federal level down.

    3. Are we providing information students and parents find useful? 
Is there such a thing as ``too much'' information?

    The information currently collected and provided is useful for 
students and parents. I will, however, assert that the information 
is not in a useful format, nor are the terms easily understood, 
particularly for 
first generation students/families.
    The data, for example, may only include students who are first time 
in college. Students may assume based on the data that the ``average'' 
student will graduate in X number of years. However, the reported data 
only applies to first time full time enrollees, which may constitute 
only a portion of the total student population (in many cases, a small 
portion of the student population). A student could discern this 
caveat, but is not likely to do so without significant prior knowledge 
and understanding of how the data works, and even then only with 
knowledge of what other data to consider.
    A project more useful to parents and students than collecting new 
data would be to transform existing data into more user friendly 
formats. I do believe the current IPEDS data reported/collected is 
useful for parents and students. A more visually appealing presentation 
of the data would be useful, with a focus on terms understandable to 
the consumer of the information.
    More data is not necessarily more useful. It would be easy to 
overwhelm the consumer with so much data it is difficult for them to 
discern that which is important. They can easily be ``buried'' in the 
data such that they ``give up'' on finding that which they need. Our 
approach in the A&M System Analytics project (which is not primarily 
targeted to parents/students but rather to Regents/Legislators) is to 
separate out ``Governance'' metrics so that the viewer can go directly 
to those metrics most pertinent to their interests. A similar approach 
may be useful on other projects, where ``tabs'' on the website may 
direct consumers to the information relevant to them, thus cutting 
through data overload.

    4. What kind of manpower is needed to comply with data needs?

    Largely, the reporting is automated, requiring relatively little 
manpower. The greatest attention must come from negotiating the 
differences in definitions such that the appropriate data is submitted 
specific to that report. As noted above, if the definitions were 
consistent across reports it might be possible for one report to be 
submitted and all entities pull data from the one database (e.g., some 
sort of omnibus database for multiple uses). This would save time and 
money.
    Some data collection organizations (such as VSA, Achieving the 
Dream & NSSE) have shifted data collection away from survey form entry 
to data flat file uploads, allowing the institution's data collection 
processes to focus on validation of requested reporting variables 
instead of running numerous summary extractions and placing individual 
numbers on multiple survey forms. This process streamlines the data 
collection and reporting processes, with institution-level efforts 
focusing on T-SQL script programming and less on presentation. 
Organizations relying on the flat file submissions, which is what NCES 
used for the recent NPSAS (National Postsecondary Student Aid Study), 
would then upload the flat files for inclusion into a database that 
could provide for the extraction and presentation of survey variables.
    I cannot provide useful feedback regarding the VSA, as VSA reports 
are submitted by the institutions and not the System. IR officers in 
the A&M System report VSA is not particularly time consuming due to the 
automation mentioned above.
    The A&M System Analytics project has been time-consuming at the 
System offices. Since approximately April 1, one data analyst has 
devoted approximately one-quarter of his time to this project and the 
``lead senior software developer'' has devoted approximately 75% of his time 
to the project. Six additional individuals (all holding professional 
positions) in the A&M System offices have devoted varying amounts of 
time to the project (e.g., additional programming, web design, 
accounting, administrative). It is important to note that this accounts 
only for time and effort at the central System offices. Each 
institution has contributed data. For most institutions, the contact 
person has been the provost/VPAA and the Institutional Research 
officer. I estimate that each has devoted a measurable but 
insignificant amount of time to the project, as the vast majority of 
data has been pulled from existing reports. This is possible only 
because Texas has a robust reporting system, allowing us to tap into 
existing reports to create our analytics site.
    It is important to note that the above time allocations have 
primarily been invested in transforming existing data into more usable 
forms. If we were creating new data (which we anticipate doing in the 
coming months), the amount of time required will increase 
commensurately.

     Dr. Schneider's Response to Questions Submitted for the Record

    Following my testimony before your subcommittee on 20 September 
2012, you asked me to address three questions:

    1. If I could eliminate any data collection or reporting 
requirement currently collected by the federal government, which ones 
would I select and why?
    2. What specific components of IPEDS do I think are the most 
useful? Which data elements should be reconsidered or reformed?
    3. What data points should the federal government request from the 
states? What lessons could the federal government learn from the states 
in terms of streamlining its data collections?

    I answer these questions below.
1. Reducing the burden of data collection
    I will answer the first two questions from the perspective of the 
National Center for Education Statistics (NCES), the agency with whose 
data collections I have the most direct experience. I will start with 
IPEDS because this is the single largest data collection that 
postsecondary institutions are subject to, and the one that is most 
often at the center of complaints about burden.
    Like IPEDS' antiquarian focus on first-time, full-time beginning 
students when it measures student success in college, other parts of 
IPEDS seem to be the equivalent of an archeological dig revealing 
layers upon layers of measures created by civilizations long gone. Many 
of the data items in IPEDS were created in response to what seemed like 
a good idea years ago or in response to legislation that is in need of 
updating, given the rapidly changing nature of American society and 
America's colleges.
    A critical step in modernizing IPEDS is to make better use of the 
inventory of what is required by law or regulation that appears in The 
History and Origins of Survey Items for the Integrated Postsecondary 
Education Data System (http://nces.ed.gov/pubs2012/2012833.pdf).
    Taking this step is essential, since many items that campuses now 
report are rooted in legislation and NCES or the Department of 
Education cannot simply stop collecting the data or stop requiring 
campuses to report them. In the upcoming reauthorization of the HEA, 
with a firm sense of the basis for IPEDS items, Congress will be in a 
far better position to clean house.
    The second step is to determine empirically which measures in IPEDS 
are used, by tracking how frequently each measure is downloaded 
from the IPEDS data center. This empirical evidence could and should 
act as guidance for what can be removed from IPEDS. This is especially 
true if unused items are not rooted in legislation or regulations.
    In terms of low-hanging fruit, I believe that most of the human 
resources survey is burdensome (NCES estimates over 25 hours on average 
to complete) and not often used--but this supposition could be verified 
empirically. But even if the survey is not used, as made clear in The 
History and Origins of Survey Items report referenced above, most of 
its items are rooted in legislative requirements and NCES can't simply 
stop administering the survey without Congressional action. Also, IPEDS 
will soon begin collecting information on academic libraries. 
The federal interest in collecting this information is at best minimal. 
I suggest that it not be added.
    While some items and perhaps some surveys can be dropped, others 
can be improved. Most notably, IPEDS finance data are among the most 
important data IPEDS collects, but they are flawed. Some of this is 
rooted in GASB v. FASB reporting requirements, but some arises because 
institutions report similar expenditures in different categories, making 
comparability difficult. This is a particular problem given the 
different reporting categories used across the for-profit, 
not-for-profit, and public sectors. As a quick indicator of the problem, 
consider that public and not-for-profit institutions each report their 
expenses per FTE in seven categories, while for-profit institutions use 
only three reporting categories. Given the growth of the for-profit 
sector, the lack of comparability is a serious problem that has been 
recognized, and some progress on this front may soon be coming. But 
given the centrality of these data, we need to monitor this progress 
closely.
    Given existing technology, many institutions impose heavier burdens 
on themselves than necessary. Far too few data reporters take advantage 
of technology options that could help reduce burden. A surprising 
number of data reporters enter the data screen by screen rather than 
use any of several automated routes provided by NCES.
    Most institutions of higher education have their data systems 
handled by a small number of providers. These vendors have been slow to 
build modules onto their systems that might assist institutions in more 
easily reporting IPEDS data. However, NCES' current work to link IPEDS 
aggregate data elements back to Common Education Data Standards 
(student-level data elements) should allow vendors to use technology 
more effectively to reduce reporting burden, but only if the vendors 
find a financial incentive to do so.
    Some of these technological fixes will show up shortly; others are 
here already. Nonetheless, some of the burden of IPEDS results from the 
fact that institutions must aggregate their data to fill in specific 
cells rather than report individual-level data. I understand the 
concerns about the federal government holding student-level data. 
However, there are solutions to this problem--including having states 
de-identify data before sharing with the federal government and new 
statistical techniques that can create ``synthetic data sets'' that 
mimic the original data but are divorced from personally identifiable 
information.
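As a minimal sketch of the de-identification option mentioned above (the salt value and record fields are invented for illustration), a state could replace each student identifier with a salted one-way hash before sharing records, so that a student's records remain linkable to one another without remaining personally identifiable:

```python
import hashlib

# Hypothetical salt held by the state and never shared with the federal
# recipient, so hashed IDs cannot be reversed to student identities.
STATE_SALT = b"example-salt-not-for-production"

def de_identify(student_id: str) -> str:
    """Replace a personally identifiable student ID with a one-way hash.
    The same student always maps to the same token (so records can still
    be linked across files), but the token cannot be mapped back to the
    student without the state-held salt."""
    return hashlib.sha256(STATE_SALT + student_id.encode()).hexdigest()

record = {"student_id": "123-45-6789", "degree": "BA", "grad_year": 2012}
shared = {**record, "student_id": de_identify(record["student_id"])}
```

Synthetic data sets go a step further, replacing the records themselves with statistically similar fabrications; the hashing shown here only removes direct identifiers.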
    Approaching any data set that contains sensitive personal 
information presents both benefits and risks to any government holding 
these data. And at the current time, the federal government has decided 
that the risks outweigh the benefits. However, it is important for 
Congress to periodically consider the changing balance between risks 
and benefits. The coming reauthorization of the HEA presents such an 
opportunity.
    The benefits are clear--more accurate and more comprehensive 
measures of student success. We need to consider the risks in 
relationship to these benefits--and we need to see how far we can 
ameliorate these risks to reach a point where Congress might feel 
comfortable with something like the Wyden/Rubio/Hunter Know Before You 
Go proposed legislation.
2. Disclosure requirements
    So far, I have discussed reporting requirements. I believe that 
Congress should investigate further how campuses treat disclosure 
requirements. I suspect that colleges and universities are all too 
often treating these informational items casually.
    Systematic evidence of this is found in a November 2011 report by 
Kevin Carey and Andrew Kelly on institutional conformity with six 
disclosure rules, entitled The Truth Behind Higher Education Disclosure 
Laws (http://www.thecollegesolution.com/wp-content/uploads/2011/11/
HigherEdDisclosure--RELEASE.pdf). Carey and Kelly found that compliance 
was often quite weak--most notably, only around a quarter of the 
schools disclosed the six-year graduation rate for students who receive 
a Pell Grant.
    Another area in which colleges and universities have been less than 
forthright concerns Net Price Calculators, which are supposed to be 
displayed on campus websites.
    I have spent hours on many campus websites looking for these 
calculators--and many of my friends have done the same thing, either 
for professional reasons or because they have children who are applying 
for college. There is agreement that while the letter of the law is 
being followed, the spirit of the law is all too frequently violated. 
On many websites it can take as many as 10 ``clicks'' to find the Net 
Price Calculator, and many of the calculators seem unnecessarily 
complicated.
    Given how important consumer information is to making our system of 
higher education work, Congress has rightly determined that some 
important information needs to be disclosed by institutions. Congress 
needs to determine how well that information is being conveyed to the 
public.
    The National Postsecondary Education Cooperative issued two reports 
trying to provide guidance to institutions on how to make these 
consumer information disclosures easier to find, but as the second 
report indicates, colleges still seem to fail to present the 
information in a way that allows it to be easily compared across 
institutions.\1\ I 
would recommend that institutions be provided a template that outlines 
specifically how the information should be made available on websites 
and be presented consistently across institutions. (NPEC is working on 
such a voluntary disclosure template now).
    \1\ The reports can be found at http://nces.ed.gov/pubsearch/
pubsinfo.asp?pubid=2010831rev and http://nces.ed.gov/pubsearch/
    Finally, Congress could suggest that ED develop a set of 
standardized micro-data tags that could be written into the HTML of 
institutions' websites. These tags would mark each data item and make 
it easier (1) to find via web searches and (2) to be collected and 
aggregated by researchers or others through web-scraping technology. 
This is a far less burdensome option than turning all disclosure 
requirements into reporting requirements, which some suggest should be 
done--and it would open up the data for creative aggregation and use by 
researchers and private companies.
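To illustrate the micro-data tag idea (the data-ed-item attribute name and the page fragment below are assumptions for illustration; no such standardized tag set currently exists), a researcher could harvest tagged disclosure items with a short scraper built on Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

# Hypothetical page fragment using an assumed "data-ed-item" attribute.
PAGE = """
<span data-ed-item="net_price">$14,500</span>
<span data-ed-item="grad_rate_6yr">58%</span>
"""

class DisclosureScraper(HTMLParser):
    """Collect the text of every element carrying the (assumed)
    data-ed-item tag, keyed by the tag's value."""
    def __init__(self):
        super().__init__()
        self.items = {}
        self._current = None  # item name of the element being read, if any

    def handle_starttag(self, tag, attrs):
        # Remember the item name when an element is tagged; None otherwise.
        self._current = dict(attrs).get("data-ed-item")

    def handle_data(self, data):
        if self._current and data.strip():
            self.items[self._current] = data.strip()
            self._current = None

scraper = DisclosureScraper()
scraper.feed(PAGE)
```

With such tags in place, the same few lines would work against any institution's site, which is what makes aggregation across thousands of campuses feasible without a new reporting requirement.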
3. State data systems
    Many states hold student-level data, and these data could and 
should be used to generate more accurate measures of student success 
covering far more students than are covered by IPEDS. These state data 
systems can also be more effectively used to populate IPEDS, reducing 
the burden on campuses. We may need to make it clear, legislatively or 
by regulation, that states can act as delegated agents of institutions 
to populate IPEDS.
    As Congress considers more complete measures of student success 
both while in school and after graduation, state data systems will play 
a central role. For example, as we move toward providing students with 
more information about the likely earnings outcomes of their college 
degrees, the nation will need to tap into state held unemployment 
insurance data that will allow us to map earnings data (held in state 
UI data systems) onto student level data (held by state student unit 
record systems). However, in work that I am doing with these linked 
data in several states, we have found that only about half of the 
graduates in a student unit record system are matched in the state 
unemployment insurance record system. This match rate is driven by 
several factors, but the likely largest contributing factor is 
interstate mobility: because state UI data do not cover students who 
graduated within a state but are now working in another state, many 
students are not found. The Congress should thoroughly review progress 
on the Wage Record Interchange System (WRIS II) to see if this could 
increase coverage. My impression is that WRIS II is under-used, and 
most people working for state higher education agencies have little or 
no knowledge of this federally financed resource.
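The match-rate problem described above can be sketched with a toy example (all identifiers and wage figures below are invented):

```python
# Graduates found in a state's student unit record (SUR) system.
sur_graduates = {"s1", "s2", "s3", "s4"}

# In-state UI wage records, keyed by the same student identifier.
# Graduates now working in another state (here, s2 and s4) simply
# do not appear in this state's UI file.
state_ui_records = {"s1": 41000, "s3": 38500}

def match_rate(graduates, ui_records):
    """Share of SUR graduates with an in-state UI wage record."""
    matched = [g for g in graduates if g in ui_records]
    return len(matched) / len(graduates)

rate = match_rate(sur_graduates, state_ui_records)
```

Here the rate is 0.5, echoing the roughly half of graduates matched in the linked-data work described above; a cross-state interchange such as WRIS II would, in effect, widen the ui_records lookup to include other states' wage files.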
    State data systems also vary in coverage. States with student unit 
record systems cover public institutions (although sometimes, as in 
California, there may be several systems), and some cover 
not-for-profit IHEs, but I don't know of any that cover for-profit 
colleges and universities.
    The federal government has invested hundreds of millions of dollars 
to build state longitudinal data systems. It behooves the Congress to 
ensure that the nation reaps a commensurate reward on that investment. 
One area in which the rewards can be realized is by having state data 
systems reduce the burden on institutions through reporting for more 
students and more institutions, with better measures. The Congress 
should also investigate the extent to which these hundreds of millions 
of federal taxpayer dollars have actually been used to improve the flow 
of consumer information to students and their families, allowing them 
to more wisely choose postsecondary institutions that do a better job 
of educating students who go on to be productive members of our 
society.
    [Whereupon, at 11:45 a.m., the subcommittee was adjourned.]