
Applied Knowledge and the Art of Winging It

  • Alyssa Jennings
  • Dec 1, 2019
  • 4 min read
When I began my graduate study over two years ago, my foremost aspiration was to learn how to learn – to gain the skills necessary to collect and interpret information and draw my own conclusions about the things I cared about.

By that point, I’d already dabbled in a few qualitative methods of data collection and analysis as part of my undergraduate honors thesis; however, my efforts then were comparable to those of someone stumbling in the dark through a room they’d only ever read about. Sure, I had read about conducting primary research in a handful of my undergraduate classes, but it was typically described in generalizations and abstractions – e.g. interviews involve one-on-one conversations, focus groups involve interviewing groups of people, you always start with a research question, etc. As a result, when the time came to actually design my own research project for my honors thesis, I found myself at a loss as to how to operationalize all of the conceptual material I had read about how primary research is done.


In retrospect, I now think of my undergraduate honors thesis as an extended exercise in winging it, and while I’m still quite proud of the insights I gleaned from the research I conducted, I also can’t deny that my understanding of how to go about said research was slipshod at best. For this reason, by the time I entered graduate school, I was practically champing at the bit to finally receive more than common-sense descriptions that were only ever intended to provide a general sense of primary research. Put simply, I wanted more, and this desire was accordingly reflected in the first NASPA/ACPA competency I prioritized in the professional development plan I created at the start of my higher education administration graduate program: assessment, evaluation, and research (AER).

The professional development plan I created at the start of my higher education master's program.

Now, two years later, this desire persists, as I’ve seen time and again that higher education administrators rarely effect change without first conducting and evaluating assessments. In short, data talks, and I accordingly see this competency as integral to my ability to persuade institutional administrators (i.e. my future superiors) to implement the many ideas and initiatives I may propose as a higher education professional.


Fortunately, my internship experiences at William Peace University’s Center for Student Success have afforded me ample practice in this competency, which has in turn heightened my awareness of how applied knowledge is cultivated within postsecondary institutions, or at the very least at WPU. At the start of every academic year, for example, WPU administers the College Student Inventory (hereafter the CSI) to every incoming first-year student as a way to assess their personal, professional, and academic needs as they transition and acclimate to the culture and expectations of higher education. The inventory likewise identifies students at risk of dropping out and/or transferring to another institution, and while WPU does not explicitly reference any particular theory in its discussions of the CSI, it’s readily apparent that the instrument corresponds with higher education scholarship such as Tinto’s model of voluntary student departure.

Indeed, the model’s tenet that academic and social integration reciprocally influence student persistence is particularly applicable to WPU’s treatment of the CSI, as much of my internship experience has involved developing and compiling response plans to this year’s CSI data in order to improve retention among first-year students.

More specifically, I’ve spoken with staff from various offices (e.g. the Office of Academic Support, the Career Design Center, the Office of Academic Advising) to help create and clarify ways in which WPU might improve and extend its services and supports so as to mitigate the transitional challenges incoming students reported on the CSI. In this way, WPU staff seem to be developing their own applied knowledge – that is, informal theory based upon the student input and feedback they receive. For example, the Career Design Center sees low participation in its services among first-year students, even though approximately 50% of incoming students indicated an interest in receiving help identifying a major and career.

A sample College Student Inventory (CSI) report.

Given this discrepancy, the new director of the Career Design Center hypothesized that first-year students were unaware of the Career Design Center and perhaps did not yet consider themselves ready to use its services. As a result, one of her intended initiatives is a marketing campaign (e.g. flyers, emails, digital signage) that highlights the CSI data on first-year students’ interest in receiving support to identify a major and career. Preparations for this effort are currently underway, and her hope is that, by publicizing such data alongside information about the Career Design Center, the Center’s services will be normalized as a viable option for students.


Only time will tell whether her “theory” proves right. For my part, it’s been incredibly worthwhile to take part in reviewing institutional data, especially data used for retention purposes, and then to see it inform the operation of offices and departments. Up to this point, I had only ever taken part in research and assessment as part of a class assignment, the expectations for which could not possibly align perfectly with what actually occurs in higher education practice. My experiences interning at WPU have thus operationalized what research and assessment actually look like in the field, where student affairs practitioners’ responsibilities span not only research and assessment but also the day-to-day operation of their functional areas.

Perhaps most importantly, however, I’ve realized that you don’t have to be an expert on research and assessment methods in order to conduct, interpret, and use research and assessment in higher education practice. Several staff members with whom I spoke expressed no small degree of confusion over interpreting the massive CSI dataset, for instance, and my supervisor, WPU’s Director of Retention and First Year Experience, consulted with a CSI representative in order to learn how to read, understand, and evaluate the data the instrument captures. If anything, then, it would seem that higher education professionals are “winging it,” at least to some extent, just as I was as an undergraduate.


