
Social Media and Student Identity Development

Posted by reyjunco on July 23, 2014 in Research |

It's here! The first (free) chapter from my upcoming book, Engaging Students through Social Media: Evidence-Based Practices for Use in Student Affairs. The chapter covers how youth's interactions online help them develop their identity, or a stable sense of self. Download the chapter here. The Kindle version of the entire book is now available on Amazon.com; the hardcover will be released on August 18th. I look forward to hearing your thoughts about this chapter (which happens to be my favorite) and the rest of the book.

 



Using Social Media in Student Affairs: An Evidence-Based Approach #ACPA14 Slides

Posted by reyjunco on April 3, 2014 in Presentations |

Here are the slides from the talk I gave this past week at the #ACPA14 conference in Indianapolis. The talk is loosely based on a chapter from my upcoming book, Engaging Students through Social Media: Evidence-Based Practices for Use in Student Affairs.



Heading to Iowa

Posted by reyjunco on April 1, 2014 in Commentary |

I'm incredibly excited to announce that I've accepted a position as an associate professor in the School of Education at Iowa State University starting this summer. I'll be teaching and advising students in the Student Affairs graduate program.

Many of you who follow this blog know that I started a new position in the Purdue University Libraries this past year. I have enjoyed my time at Purdue; the Libraries faculty are a dynamic and interesting bunch. My explorations of information literacy from an information science perspective will forever influence my research. Not to mention that Purdue is a great institution (and a really cool college town).

I have often heard from student affairs professionals who want to begin a Ph.D. focusing on social media and emerging technologies but have hesitated because no programs focus on such issues. Now is your chance to come work with me! Not only will I be continuing my research on how new technologies influence student development, but I'm also joining an already impressive and vibrant community of scholars at Iowa State. We've also got two new assistant professors joining us in the fall whose research centers on social justice issues.

Drop me a line to learn more, or find me at #ACPA14.



Textbook analytics: A new way to do learning analytics

Posted by reyjunco on October 17, 2013 in Research |

Today at the EDUCAUSE 2013 conference in Anaheim, I unveiled new research on textbook analytics. Textbook analytics are an emerging subcategory of learning analytics, which is the use of student-generated data to predict learning. These predictive analytics promise the ability to identify at-risk students and to help faculty adjust their teaching in real time.

However, up to now, learning analytics projects have collected limited data. Typical learning analytics systems are tied to Learning and Course Management Systems (LCMSs) and collect data like number of logins, number of discussion posts, etc. Additionally, research showing the predictive ability of learning analytics is limited because these studies relate grade-earning activity with course grades. In other words, students earn grades for discussion posts on LCMSs, so of course number of posts would be related to student course grades.

Textbook analytics provide information on how much students are reading and how they are engaging with their digital textbooks. CourseSmart has developed a textbook analytics platform that unobtrusively calculates an Engagement Index based on how students are interacting with their textbook.

CourseSmart provided data on 233 students including their Engagement Index scores, their background characteristics, and their final course grades. Using a blocked linear regression controlling for gender, race/ethnicity, and prior academic achievement (student transfer GPA), I found that the Engagement Index was significantly predictive of final course grades. In fact, the Engagement Index was a stronger predictor of final course grades than prior academic achievement (see figure below), which has been shown in previous research to be the strongest single predictor of student success.

[Figure 1 from the CourseSmart analytics research paper: regression results comparing the predictive strength of the Engagement Index and prior academic achievement]
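
For readers who want to see the shape of this analysis, below is a minimal sketch of a blocked (hierarchical) regression in Python. This is not the actual CourseSmart analysis: the data file and every variable name (course_grade, transfer_gpa, engagement_index, and so on) are hypothetical stand-ins.

```python
# Hedged sketch of a blocked regression: enter the background controls
# first, then add the Engagement Index and inspect the change in R-squared.
# The CSV file and all column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # one (hypothetical) row per student

# Block 1: background characteristics only
m1 = smf.ols("course_grade ~ C(gender) + C(race_ethnicity) + transfer_gpa",
             data=df).fit()

# Block 2: add the Engagement Index
m2 = smf.ols("course_grade ~ C(gender) + C(race_ethnicity) + transfer_gpa"
             " + engagement_index", data=df).fit()

print(m2.summary())
print(f"R-squared change from adding the index: {m2.rsquared - m1.rsquared:.3f}")
```

Entering the controls as a first block lets you report how much variance the Engagement Index explains over and above gender, race/ethnicity, and prior academic achievement.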

What was especially interesting was that highlighting was related to student course outcomes, although not in the way you might think. Students in the top decile for number of highlights had significantly lower course grades than the remaining 90 percent of students. This is congruent with previous survey research showing that low-skill readers highlight more text, and more often, than high-skill readers. These results suggest that these types of analytics could identify students who need help with their reading skills. A sketch of that decile comparison follows below.
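
As a rough illustration of the decile comparison (again with hypothetical data and column names, not the study's actual code), the split-and-test step might look like this:

```python
# Compare course grades for students above vs. at-or-below the 90th
# percentile of highlight counts. Column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("students.csv")
cutoff = df["num_highlights"].quantile(0.90)
top_decile = df.loc[df["num_highlights"] > cutoff, "course_grade"]
rest = df.loc[df["num_highlights"] <= cutoff, "course_grade"]

# Welch's t-test, which does not assume equal group variances
t, p = stats.ttest_ind(top_decile, rest, equal_var=False)
print(f"top decile mean = {top_decile.mean():.2f}, "
      f"rest mean = {rest.mean():.2f}, t = {t:.2f}, p = {p:.4f}")
```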

Textbook analytics open up possibilities for real-time and unobtrusive formative assessment for faculty. With a single index, instructors can gauge how much and how well students are engaging with their textbooks, identifying at-risk students before they ever turn in a gradable assignment or interact on the LCMS. Plus, textbook analytics open up possibilities for new methods to research student reading and its relationship to student outcomes.

You can read more about this in the CourseSmart research report: Evaluating How the CourseSmart Engagement Index Predicts Student Course Outcomes.



Student affairs professionals: Looking for social media examples for my book

Posted by reyjunco on July 11, 2013 in Commentary, Research |

Student affairs professionals: I need your help for my next book, Engaging Students through Social Media: An Evidence-Based Approach for Student Affairs, being published by Wiley/Jossey-Bass.

I’m looking for examples of how you are using social media in your functional areas. Successes and challenges are both welcome! Feel free to post your story in the comments section or send it privately to me via email by clicking here. Please indicate whether you would like to be identified or whether you would like for your contribution to remain anonymous.

Here are some questions to help frame what I’m looking for (note that I’m not looking for you to answer every question – they are just food for thought):

  1. What did you do? Which social media tool did you use? How did you use it? How did you get students to use it with you? How did you overcome departmental/division/institutional resistance, if any?
  2. What worked? How did students respond to the intervention? What did you do (if anything) to measure what worked?
  3. What didn’t work? What were the challenges you faced? Were there challenges you didn’t expect?
  4. What were the major takeaways?
  5. What advice would you give to others?

If I use your example and you choose to be identified, you’ll get credit in the chapter where the example appears and I’ll also list you in the acknowledgements.

Thanks!



Berkman Center Fellowship

Posted by reyjunco on July 8, 2013 in Commentary, Research |

I am very happy to announce that I'll be returning to the Berkman Center for Internet and Society this next academic year as a fellow. I am both honored and incredibly excited for this opportunity to continue to engage and collaborate with the Berkman community, especially the Youth and Media team. I had such a blast this last year collaborating with Youth and Media on projects focusing on youth privacy, memes, and evaluating youth technology use in developing nations. I also examined and discussed the right to be forgotten with Meg Ambrose, textbook piracy with Bodó Balázs, engaging students in novel ways with Eric Gordon, open access with Peter Suber, the future of education with Justin Reich, hacker culture with Molly Sauter, information quality with Alison Head, and privacy tools with Ryan Budish, and I had a really fun time leading members of the Berkman community in statistics and methodology sessions. And that list doesn't even cover the many fascinating lunchtime talks, fellows hours, and digital identity working group sessions.

This coming year, I'll be finishing my book Engaging Students through Social Media: An Evidence-Based Approach for Student Affairs, being published by Wiley/Jossey-Bass. I'll also be engaged in a number of fun analytics projects. I'm currently working on a paper analyzing how textbook analytics can help predict student success. I'll also be using a dataset containing comprehensive data on student technology use to build predictive models of student success and academic resource utilization. It's an extension of current learning analytics models to include more predictors and therefore improve their predictive ability. Put another way, I'll be using big data to improve learning analytics.



New position at Purdue University

Posted by reyjunco on February 11, 2013 in Commentary |

I am delighted to announce that later this semester, I will be joining the faculty of the Purdue University Libraries as an associate professor. At Purdue, I will focus on emerging technologies in education with a special emphasis on the first-year experience. If you don't already know about the great work happening in educational technologies at Purdue, I'd recommend checking out their ITaP studio, where they've developed learning tools that include an app that integrates with Facebook to increase student engagement, a learning analytics platform, and a badging system. I'll also continue my current line of research and look forward to the expansive new lines of inquiry I'll pursue in collaboration with my new Purdue colleagues.

Image credit: martinliao http://www.flickr.com/photos/martinliao/7209746342/



Comparing actual and self-reported measures of Facebook use

Posted by reyjunco on January 18, 2013 in Research, Survey Design |

If you are a regular reader of this blog, you likely already know that there is a growing body of research examining how college students use Facebook and the outcomes of such use. For instance, researchers have examined how Facebook use is related to various aspects of the college student experience, including learning, student engagement, multitasking, political activity and participation, life satisfaction, social trust, civic engagement, identity development and peer relationships, and relationship building and maintenance.

All of the previous research has relied on self-reported measures of Facebook use (that is, survey questions). We know from research in other areas of human behavior that there are significant differences between actual and self-reported behaviors. One of my favorite examples is a study in which researchers found that up to 50% of self-reported non-smoking head and neck cancer patients were in fact smoking, as measured by exhaled carbon monoxide levels and levels of a nicotine metabolite in their blood.

As you might imagine, differences between self-reported and actual uses of Facebook could drastically change or even negate findings of how Facebook use is related to the aforementioned outcomes. My latest paper published in Computers in Human Behavior, Comparing actual and self-reported measures of Facebook use, examines these differences.

Here is what I did: I paid students to allow me to install monitoring software on their computers for one month. I also surveyed them to ask how much time they spent on Facebook and how many times they logged in to the site. I also monitored and asked about other forms of technology and social media use (like Twitter and email).

Here is what I found: As you can see in the scatterplot below, there was a significant positive correlation between self-reported and actual Facebook use (Pearson’s r = .587, p < .001).

[Figure: scatterplot of the correlation between self-reported and actual Facebook use]

However, and here is the really interesting part, students significantly overestimated the amount of time they spent on Facebook. They reported spending an average of 149 minutes per day on Facebook which was significantly higher than the 26 minutes per day they actually spent on the site (t(41) = 8.068, p < .001).
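
For the statistically curious, here is a minimal sketch of the two tests reported above: the Pearson correlation between self-reported and actual use, and the paired t-test on the same two measures. The data file and column names are hypothetical illustrations, not the study's actual code.

```python
# Correlate self-reported with actual daily Facebook minutes, then test
# whether the paired difference between the two is significant.
# The CSV file and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("facebook_use.csv")  # one (hypothetical) row per student

r, p_r = stats.pearsonr(df["self_reported_min"], df["actual_min"])
t, p_t = stats.ttest_rel(df["self_reported_min"], df["actual_min"])

print(f"Pearson r = {r:.3f} (p = {p_r:.4f})")
print(f"paired t = {t:.3f} (p = {p_t:.4f})")
```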

What is going on? In the paper, I go into much more detail about why there is such a large and significant difference between actual and self-reported Facebook use, as well as why the two are significantly correlated. In brief:

  1. It could be that self-report questions aren’t specific enough to capture frequency of Facebook usage. Students may interpret a question asking “how much time do you spend on Facebook each day?” as meaning “how much time are you logged in?” or as a post-hoc focus group suggested “how much time do you spend thinking about Facebook?”
  2. Students may have implicit theories about how they use technology that are likely based on messages received from the media and adults (i.e., "youth use Facebook a lot!"). This would lead them to give inflated estimates of the actual amount of time they spend on the site.
  3. Students may have accessed Facebook from multiple devices. This is certainly an important consideration, especially given the popularity of mobile Facebook use. However, I conducted analyses to try to explain the unaccounted-for self-reported time with mobile use and found that there was no way to explain the large difference between actual and self-reported time. Students could have used Facebook from other computers that were not being monitored; however, that is unlikely given their overall computer use. This is definitely an important facet to study in future research.
  4. Students (and people in general) might not be able to estimate Facebook use. This could very well be a function of how we’ve developed schemas about technology use. For instance, drivers often estimate driving distances in miles and time to destination; however, Internet users typically do not estimate frequency and intensity of use in time.

Limitations

Students who allowed their computer use to be monitored might have very different online behaviors than those who didn’t (although there were no differences between students who chose to install the monitoring software and those who didn’t on demographic variables and on all but one of the variables of interest). Another limitation is the possibility of observer effects or that the students behaved differently because they knew they were being monitored. Students may have used other devices, such as their cell phones, to access Facebook which would have provided a lower estimate of actual use.

Implications

It’s clear that self-reported measures of Facebook use can approximate but are not accurate measures of actual use. The inconsistency between self-reported and actual measurements will obfuscate how Facebook use is related to outcomes, signaling a problem for research relying on self-report measures. However, all is not lost. Self-reported measures can give us a good approximation of frequency of use and hopefully, future extensions of this research can come up with more accurate self-report measures or with a “correction factor” for self-reported use.



Mobile apps and youth privacy

Posted by reyjunco on December 13, 2012 in Commentary |

On Monday, the Federal Trade Commission (FTC) published Mobile Apps for Kids, in which they reported the results of their recent survey of how well mobile apps for kids conform to Children's Online Privacy Protection Act (COPPA) requirements.

The results were alarming: 59% of apps transmitted the mobile device ID (which includes, among other things, the app name, the app version number, the developer, a time stamp, the operating system, and the device model), 3% of apps shared geolocation, and 1% shared a phone number. 56% of these apps transmitted this information to ad networks, analytics companies, or other third parties. However, only 20% of the apps disclosed information about their data collection practices. Put another way, 80% of the apps are in violation of both the letter and spirit of COPPA, which requires that websites and online services that collect information from children must: (1) provide notice of what types of information are being collected, how the information is being used, and what the disclosure practices are; and (2) obtain verifiable parental consent in order to collect, use, or disclose children's data.

Why is this a big deal?

The information collected from these apps could be used to find or contact children because they collect geolocation and phone numbers. Remember the uproar when the iPhone was secretly tracking and storing your every move? This, I would posit, is even worse. These apps are tracking children’s activities across different apps without their parents’ knowledge or consent. The information collected was often transmitted to advertising networks with no disclosures as to why the advertising networks needed it or how they would use it. Such tracking builds profiles of children (their likes, dislikes, browsing habits, etc.) for insidious forms of marketing. This is analogous to the tracking and advertising that happens on the web – of which most adults are unaware. Through such tracking, advertisers can build very accurate profiles of children to “push” advertising—it’s a generally subconscious and powerful form of tracking and marketing and one that we should be protecting children from until they have the cognitive capabilities to resist such influences.

I don’t know about you but I don’t trust tracking and ad agencies and undisclosed third parties.

What can we do about it?

If you are a parent, you can't do much about it. Remember that 80% of the apps provided no disclosure about the fact that they were collecting data, so it's not as if you can discriminate between apps that send this information and those that don't (unless, of course, an app explicitly states that it doesn't send this information).

We need to put pressure on app developers to provide appropriate disclosures. Reuters reported that the "Association for Competitive Technology, which represents more than 5,000 small and medium-sized app developers, said developers were often unsophisticated about legal obligations but that the group held workshops and boot camps to train them in best practices." OK, sure, they may be unsophisticated about legal obligations; however, this statement suggests that developers have little concern about the ethics of collecting and sharing data from minors.

A coalition that includes the Application Developers Alliance, the ACLU, and the World Privacy Forum has been working on standardizing a short-form notice for app privacy disclosures. Of course, the advertisers aren’t too keen on this and are trying to come up with their own self-enforcement policies.

Lastly, we need to support the FTC in expanding their enforcement of COPPA to include geolocation and personal identifiers such as device IDs. Many have argued that COPPA is outdated and this is yet another instance that emphasizes this point.

Image credit: ohmeaghan http://www.flickr.com/photos/ohmeaghan/6014480823



Don’t Facebook & text during class, email instead

Posted by reyjunco on November 27, 2012 in Research |

My most recent paper on multitasking, In-class multitasking and academic performance, has uncovered some interesting results. I conducted a survey of 1,839 college students and asked them how often they multitask during class by using Facebook, texting, emailing, searching for content not related to the class, IMing, and talking on the phone. I also collected students' actual overall GPAs for the semester in which the study was conducted. In this post, I'll focus only on the high-frequency and moderate-frequency activities:

Texting was a high frequency activity: 69% of students reported texting during class.

Using Facebook, searching for content not related to the class, and emailing were moderate frequency activities: 28% of students said they used Facebook during class, 28% of students said they used email during class, and 21% of students said they searched for content during class.

Here’s where it gets interesting: Using Facebook and texting during class were significantly negatively related to overall semester GPA after controlling for gender, race/ethnicity, and Internet skill. However, emailing and searching during class were not related to GPA.

While incongruent with the multitasking literature in cognitive science, these results are consistent with recent research that found comparable patterns. In a similar study, Shelia Cotten and I found that using Facebook and texting while doing schoolwork were negatively associated with overall college GPA, while emailing, searching, talking on the phone, and instant messaging were not. Furthermore, an experimental study by Wood et al. (2012) found that students who used Facebook while attending to a lecture scored significantly lower on tests of lecture material than those who were only allowed to take notes using paper and pencil; however, the scores of students who texted, emailed, or sent IMs did not differ significantly from those of students in control groups.

What is going on?

While further research is warranted, I've got a few hypotheses. First, there may be something about the technologies themselves that leads to poorer outcomes. Second, the discrepancies in outcomes may lie in the nature of how the technologies are used and the frequency with which they are employed. For instance, Rosen et al. (2011) found that students who sent and received the most text messages while watching a lecture video scored lower on a test of the lecture material than those who sent the fewest; however, there was no difference in scores between students who sent a moderate number of messages and the other groups. My final hypothesis to explain the discrepancy between Facebook and texting and the other technologies is related to the activities students engage in while using each. For instance, my research has shown that how Facebook is used is a better predictor of academic outcomes than how much time is spent on the site.

Limitations

The standard correlation-versus-causation limitations apply: this is a cross-sectional, correlational design, and more research is certainly warranted. While the sample on which this research was based was representative of the overall university population, it may not be representative of all institutions in the United States. The fact that participants were recruited via email and that the survey was administered online could have biased the sample toward students who regularly use email (and who perhaps multitask more). A final limitation is that the frequency with which students multitask during class was assessed via self-report.

Image credit: anna-b http://www.flickr.com/photos/anna-b/3218868484/


