
Recruiting Ph.D. students – Come work with me!

Posted by reyjunco on November 5, 2014 in Research

Image credit (CC license): http://photos.jdhancock.com/photo/2013-07-01-010715-computer-chip.html

I’m looking for Ph.D. students who want to come work with me at Iowa State. If you are interested in social technologies and how they impact youth, please apply! This would be a funded position, and you would work in my emerging research group, which will be composed of Ph.D. and master’s students (and eventually advanced undergraduate students). I’m looking for students who are passionate about this research area. The ideal student will be creative in thinking about new research studies they could (eventually) run themselves and/or in thinking about how my existing data can be analyzed. Bonus points for coding and/or statistics skills.

I’ve got a number of projects going right now:

1. Apps and Educational Success. This is a grant-funded project being conducted in collaboration with my colleagues at University of Michigan. We are evaluating a number of apps designed to help middle and high schoolers get to college as well as apps developed to support students already in college.

2. Big Data/Predictive Analytics. This is a large-scale project in which over 400 students allowed me to monitor everything they did on their computers for a month. I also have personality, survey, and institutional data on these students. One of the goals is to develop predictive models that can identify at-risk students using only trace data.

3. A project on online safety that’s currently in the works.

4. A project on digital technologies to improve self-regulation skills that’s also currently in the works.

You can apply either as a Ph.D. student in Higher Education in the School of Education [Application link - Deadline December 1, 2014] or as a Ph.D. student in Human Computer Interaction [Follow The PhD Graduate Program Application Process steps at the bottom of the page - Deadline January 15, 2015]. Let me know if you have any questions!



Looking for a middle school to collaborate on research study

Posted by reyjunco on September 10, 2014 in Research

Image credit (CC license): https://www.flickr.com/photos/cronncc/

Friends and colleagues – Our research team (headed up by Nicole Ellison and me) is engaged in a cool Gates Foundation-funded project to help support college readiness for students who otherwise wouldn’t go to college. This same Gates program funded the development of 19 apps/services/websites to help students navigate the college application and transition process more efficiently (as well as succeed in college once they are there).

We need your help

We are looking for a middle school that would be interested in collaborating on one of our projects. We will be testing a game app to see how well it helps middle schoolers learn about the college-going process. Students would use the app for about 6 weeks and we’d evaluate their college-going knowledge before and after the app intervention. This is a great way to supplement existing middle school programs, plus you’d be helping us learn more about what works with educational apps.

If you are an administrator at a middle school and are interested, please email me directly by clicking here. If you know of someone who might be interested, please forward this along to her or him.



Social Media and Student Identity Development

Posted by reyjunco on July 23, 2014 in Research

It’s here! The first (free) chapter from my upcoming book, Engaging Students through Social Media: Evidence-Based Practices for Use in Student Affairs. The chapter covers how youth’s interactions online help them develop their identity, or a stable sense of self. Download the chapter here. The Kindle version of the entire book is now available on Amazon.com, and the hardcover will be released on August 18th. I look forward to hearing your thoughts about this chapter (which happens to be my favorite) and the rest of the book.



Using Social Media in Student Affairs: An Evidence-Based Approach #ACPA14 Slides

Posted by reyjunco on April 3, 2014 in Presentations

Here are the slides from the talk I gave this past week at the #ACPA14 conference in Indianapolis. The talk is loosely based on a chapter from my upcoming book Engaging Students through Social Media: Evidence-Based Practices for Use in Student Affairs.



Heading to Iowa

Posted by reyjunco on April 1, 2014 in Commentary

I’m incredibly excited to announce that I’ve accepted a position as an associate professor in the School of Education at Iowa State University starting this summer. I’ll be teaching and advising students in the Student Affairs graduate program.

Many of you who follow this blog know that I started a new position in the Purdue University Libraries this past year. I have enjoyed my time at Purdue; the Libraries faculty are a dynamic and interesting bunch. My explorations of information literacy from an information science perspective will forever influence my research. Not to mention that Purdue is a great institution (and a really cool college town).

I have often heard from student affairs professionals who want to begin a Ph.D. to focus on social media/emerging technologies but have hesitated because no programs focus on such issues. Now is your chance to come work with me! Not only will I be continuing my research on how new technologies influence student development, but I’m joining an already impressive and vibrant community of scholars at Iowa State. We’ve also got two new assistant professors joining us in the fall whose research focuses greatly on social justice issues.

Drop me a line to learn more, or find me at #ACPA14.



Textbook analytics: A new way to do learning analytics

Posted by reyjunco on October 17, 2013 in Research

Today at the EDUCAUSE 2013 conference in Anaheim, I unveiled new research on textbook analytics. Textbook analytics are an emerging subcategory of learning analytics, which is the use of student-generated data to predict learning. These predictive analytics promise the ability to identify at-risk students and to help faculty adjust their teaching in real-time.

However, up to now, learning analytics projects have collected limited data. Typical learning analytics systems are tied to Learning and Course Management Systems (LCMSs) and collect data like number of logins, number of discussion posts, etc. Additionally, research showing the predictive ability of learning analytics is limited because these studies relate grade-earning activity with course grades. In other words, students earn grades for discussion posts on LCMSs, so of course number of posts would be related to student course grades.

Textbook analytics provide information on how much students are reading and how they are engaging with their digital textbooks. CourseSmart has developed a textbook analytics platform that unobtrusively calculates an Engagement Index based on how students are interacting with their textbook.
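
CourseSmart has not published how the Engagement Index is computed, so purely as a hypothetical illustration, an index of this kind might be built as a weighted combination of standardized interaction signals. Every name and weight below is invented, not CourseSmart’s actual formula:

```python
import numpy as np

def engagement_index(minutes_read, pages_viewed, highlights, notes):
    """Hypothetical engagement index: a weighted sum of z-scored
    interaction signals. Illustration only; CourseSmart's actual
    formula is proprietary and not public."""
    signals = np.column_stack([minutes_read, pages_viewed, highlights, notes])
    z = (signals - signals.mean(axis=0)) / signals.std(axis=0)
    weights = np.array([0.4, 0.3, 0.2, 0.1])  # invented weights
    return z @ weights

# Example: three students' interaction data -> one index score each.
idx = engagement_index(
    minutes_read=[310, 120, 45],
    pages_viewed=[400, 150, 60],
    highlights=[25, 60, 5],
    notes=[12, 3, 0],
)
print(idx)
```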

CourseSmart provided data on 233 students including their Engagement Index scores, their background characteristics, and their final course grades. Using a blocked linear regression controlling for gender, race/ethnicity, and prior academic achievement (student transfer GPA), I found that the Engagement Index was significantly predictive of final course grades. In fact, the Engagement Index was a stronger predictor of final course grades than prior academic achievement (see figure below), which has been shown in previous research to be the strongest single predictor of student success.

[Figure 1: Regression results showing the Engagement Index as a stronger predictor of final course grades than prior academic achievement]
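
For readers who want to see the mechanics, here is a minimal sketch of a blocked regression like this in Python with pandas and statsmodels. The file name and column names are hypothetical stand-ins, not the actual CourseSmart variables; entering the controls in a first block isolates the Engagement Index’s incremental contribution in the second.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per student. All file/column names are
# illustrative stand-ins for the actual CourseSmart variables.
df = pd.read_csv("engagement_data.csv")

# Block 1: background characteristics only.
block1 = smf.ols(
    "final_grade ~ C(gender) + C(race_ethnicity) + transfer_gpa", data=df
).fit()

# Block 2: add the Engagement Index on top of the controls.
block2 = smf.ols(
    "final_grade ~ C(gender) + C(race_ethnicity) + transfer_gpa"
    " + engagement_index",
    data=df,
).fit()

# The R-squared increase from block 1 to block 2 is the variance in final
# grades explained by the Engagement Index beyond the background controls.
print(f"Block 1 R^2: {block1.rsquared:.3f}")
print(f"Block 2 R^2: {block2.rsquared:.3f}")
print(block2.summary())
```

Comparing the relative strength of predictors, as in the figure above, is typically done with standardized coefficients; z-scoring the continuous variables before fitting yields them directly.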

What was especially interesting was that highlighting was related to student course outcomes, although not in the way you might think. Students in the top 10 percent for number of highlights had significantly lower course grades than students in the bottom 90 percent. This is congruent with previous survey research showing that low-skill readers highlight more text, and more often, than high-skill readers. These results suggest that analytics of this kind may be able to identify students who need help with their reading skills.
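
A sketch of that top-decile comparison, again with hypothetical column names. The report does not specify the exact test used, so Welch’s t-test is my choice here:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Same hypothetical dataset as in the regression sketch above.
df = pd.read_csv("engagement_data.csv")
highlights = df["num_highlights"].to_numpy()
grades = df["final_grade"].to_numpy()

# Split students at the 90th percentile of highlight counts.
cutoff = np.percentile(highlights, 90)
heavy = grades[highlights > cutoff]   # top 10% of highlighters
rest = grades[highlights <= cutoff]   # everyone else

# Welch's t-test: the groups differ greatly in size, so don't assume
# equal variances.
t, p = stats.ttest_ind(heavy, rest, equal_var=False)
print(f"heavy highlighters: mean grade {heavy.mean():.2f} (n={len(heavy)})")
print(f"everyone else:      mean grade {rest.mean():.2f} (n={len(rest)})")
print(f"t = {t:.3f}, p = {p:.4f}")
```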

Textbook analytics open up possibilities for real-time and unobtrusive formative assessment for faculty. With a single index, instructors can gauge how much and how well students are engaging with their textbooks, identifying at-risk students before they ever turn in a gradable assignment or interact on the LCMS. Plus, textbook analytics open up possibilities for new methods to research student reading and its relationship to student outcomes.

You can read more about this in the CourseSmart research report: Evaluating How the CourseSmart Engagement Index Predicts Student Course Outcomes.



Student affairs professionals: Looking for social media examples for my book

Posted by reyjunco on July 11, 2013 in Commentary, Research

Student affairs professionals: I need your help for my next book, Engaging Students through Social Media: An Evidence-Based Approach for Student Affairs, being published by Wiley/Jossey-Bass.

I’m looking for examples of how you are using social media in your functional areas. Successes and challenges are both welcome! Feel free to post your story in the comments section or send it privately to me via email by clicking here. Please indicate whether you would like to be identified or whether you would like for your contribution to remain anonymous.

Here are some questions to help frame what I’m looking for (note that I’m not looking for you to answer every question – they are just food for thought):

  1. What did you do? Which social media tool did you use? How did you use it? How did you get students to use it with you? How did you overcome departmental/division/institutional resistance, if any?
  2. What worked? How did students respond to the intervention? What did you do (if anything) to measure what worked?
  3. What didn’t work? What were the challenges you faced? Were there challenges you didn’t expect?
  4. What were the major takeaways?
  5. What advice would you give to others?

If I use your example and you choose to be identified, you’ll get credit in the chapter where the example appears and I’ll also list you in the acknowledgements.

Thanks!



Berkman Center Fellowship

Posted by reyjunco on July 8, 2013 in Commentary, Research

I am very happy to announce that I’ll be returning to the Berkman Center for Internet and Society this next academic year as a fellow. I am both honored and incredibly excited for this opportunity to continue to engage and collaborate with the Berkman community, especially the Youth and Media team. I had such a blast this last year collaborating with Youth and Media on projects focusing on youth privacy, memes, and evaluating youth technology use in developing nations. I also examined and/or had engaging discussions about the right to be forgotten with Meg Ambrose, textbook piracy with Bodó Balázs, engaging students in novel ways with Eric Gordon, open access with Peter Suber, the future of education with Justin Reich, hacker culture with Molly Sauter, information quality with Alison Head, and privacy tools with Ryan Budish. I also had a really fun time leading members of the Berkman community in statistics and methodology sessions. And that list doesn’t even cover the many fascinating lunchtime talks, fellows hours, and digital identity working group sessions.

This coming year, I’ll be finishing my book Engaging Students through Social Media: An Evidence-Based Approach for Student Affairs, being published by Wiley/Jossey-Bass. I’ll also be engaged in a number of fun analytics projects. I’m currently working on a paper analyzing how textbook analytics can help predict student success. I’ll also be using a dataset containing comprehensive data on student technology use to build predictive models of student success and academic resource utilization; it’s an extension of current learning analytics models to include more predictors and therefore improve their predictive ability. Put another way, I’ll be using big data to improve learning analytics.



New position at Purdue University

Posted by reyjunco on February 11, 2013 in Commentary

I am delighted to announce that later this semester, I will be joining the faculty of the Purdue University Libraries as an associate professor. At Purdue, I will focus on emerging technologies in education with a special focus on the first-year experience. If you don’t already know about the great work happening in educational technologies at Purdue, I’d recommend checking out their ITaP studio, where they’ve developed learning tools that include an app that integrates with Facebook to increase student engagement, a learning analytics platform, and a badging system. I’ll also continue my current line of research and look forward to the expansive new lines of inquiry I’ll pursue in collaboration with my new Purdue colleagues.

Image credit: martinliao http://www.flickr.com/photos/martinliao/7209746342/



Comparing actual and self-reported measures of Facebook use

Posted by reyjunco on January 18, 2013 in Research, Survey Design

If you are a regular reader of this blog, you likely already know that there is a growing body of research examining how college students use Facebook and the outcomes of such use. For instance, researchers have examined how Facebook use is related to various aspects of the college student experience, including learning, student engagement, multitasking, political activity, life satisfaction, social trust, civic engagement, political participation, identity development and peer relationships, and relationship building and maintenance.

All of the previous research has relied on self-reported measures of Facebook use (that is, survey questions). We know from research in other areas of human behavior that there are significant differences between actual and self-reported behaviors. One of my favorite examples is a study in which researchers found that up to 50% of self-reported non-smoking head and neck cancer patients were in fact smoking, as measured by exhaled carbon monoxide levels and levels of a nicotine metabolite in their blood.

As you might imagine, differences between self-reported and actual uses of Facebook could drastically change or even negate findings of how Facebook use is related to the aforementioned outcomes. My latest paper published in Computers in Human Behavior, Comparing actual and self-reported measures of Facebook use, examines these differences.

Here is what I did: I paid students to allow me to install a monitor on their computers for one month. I also surveyed them to ask them how much time they spent on Facebook and how many times they logged in to the site. I also monitored/asked about other forms of tech/social media use (like Twitter and email).

Here is what I found: As you can see in the scatterplot below, there was a significant positive correlation between self-reported and actual Facebook use (Pearson’s r = .587, p < .001).

[Scatterplot: self-reported vs. actual Facebook use, showing a positive correlation]

However, and here is the really interesting part, students significantly overestimated the amount of time they spent on Facebook. They reported spending an average of 149 minutes per day on Facebook which was significantly higher than the 26 minutes per day they actually spent on the site (t(41) = 8.068, p < .001).
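
To make the two statistics concrete, here is a minimal sketch of both tests using SciPy. The data below are synthetic, shaped only to mimic the reported pattern (low actual use, inflated self-reports); they are not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 42  # the paired analysis reported t(41), i.e., 42 students

# Synthetic paired measurements in minutes/day: actual use averages
# around 26, while self-reports are systematically inflated.
actual = rng.gamma(shape=2.0, scale=13.0, size=n)
self_reported = np.clip(actual * 3 + rng.normal(80, 40, size=n), 0, None)

# Pearson correlation between self-reported and actual use.
r, r_p = stats.pearsonr(self_reported, actual)

# Paired t-test: do the same students systematically over-report?
t, t_p = stats.ttest_rel(self_reported, actual)

print(f"r = {r:.3f} (p = {r_p:.3g})")
print(f"t({n - 1}) = {t:.3f} (p = {t_p:.3g})")
```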

What is going on? In the paper, I go into much more detail about why there is such a large and significant difference between actual and self-reported Facebook use, as well as why the two are significantly correlated. In brief:

  1. It could be that self-report questions aren’t specific enough to capture frequency of Facebook usage. Students may interpret a question asking “how much time do you spend on Facebook each day?” as meaning “how much time are you logged in?” or, as a post-hoc focus group suggested, “how much time do you spend thinking about Facebook?”
  2. Students may have implicit theories about how they use technology, likely based on messages received from the media and adults (i.e., “youth use Facebook a lot!”). This would lead them to give inflated estimates of the actual amount of time they spend on the site.
  3. Students may have accessed Facebook from multiple devices. This is certainly an important consideration, especially given the popularity of mobile Facebook use. However, I conducted analyses to try to explain the unaccounted-for self-reported time with mobile use and found that there was no way to explain the large difference between actual and self-reported time. Students could have used Facebook from other computers that were not being monitored; however, that is unlikely given their overall computer use. This is definitely an important facet to study in future research.
  4. Students (and people in general) might not be able to estimate Facebook use. This could very well be a function of how we’ve developed schemas about technology use. For instance, drivers often estimate driving distances in miles and time to destination; however, Internet users typically do not estimate frequency and intensity of use in time.

Limitations

Students who allowed their computer use to be monitored might have very different online behaviors than those who didn’t (although there were no differences between students who chose to install the monitoring software and those who didn’t on demographic variables and on all but one of the variables of interest). Another limitation is the possibility of observer effects or that the students behaved differently because they knew they were being monitored. Students may have used other devices, such as their cell phones, to access Facebook which would have provided a lower estimate of actual use.

Implications

It’s clear that self-reported measures of Facebook use approximate, but do not accurately measure, actual use. The inconsistency between self-reported and actual measurements will obfuscate how Facebook use is related to outcomes, signaling a problem for research relying on self-report measures. However, all is not lost. Self-reported measures can give us a good approximation of frequency of use, and hopefully future extensions of this research can come up with more accurate self-report measures or with a “correction factor” for self-reported use.
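
As a closing illustration of what such a “correction factor” might look like in its simplest form: fit a line mapping self-reports to actual use on a sample where both are observed, then apply that line to survey-only samples. The arrays here are the same synthetic data as in the sketch above, not the study data.

```python
import numpy as np

# Synthetic paired data shaped like the reported pattern (not study data).
rng = np.random.default_rng(0)
actual = rng.gamma(shape=2.0, scale=13.0, size=42)
self_reported = np.clip(actual * 3 + rng.normal(80, 40, size=42), 0, None)

# Fit actual = intercept + slope * self_reported on the paired sample.
slope, intercept = np.polyfit(self_reported, actual, deg=1)

def corrected_minutes(reported: float) -> float:
    """Map a self-reported estimate to a corrected estimate of actual use."""
    return intercept + slope * reported

# A student reporting 149 minutes/day gets a much lower corrected estimate.
print(f"corrected(149) = {corrected_minutes(149.0):.1f} minutes/day")
```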


