Comparing actual and self-reported measures of Facebook use

Posted by reyjunco on January 18, 2013 in Research, Survey Design

If you are a regular reader of this blog, you likely already know that there is a growing body of research examining how college students use Facebook and the outcomes of such use. For instance, researchers have examined how Facebook use is related to various aspects of the college student experience, including learning, student engagement, multitasking, life satisfaction, social trust, civic engagement, political participation, development of identity and peer relationships, and relationship building and maintenance.

All of this previous research has relied on self-reported measures of Facebook use (that is, survey questions). We know from research in other areas of human behavior that there are significant differences between actual and self-reported behaviors. One of my favorite examples is a study in which researchers found that up to 50% of self-reported non-smoking head and neck cancer patients were in fact smoking, as measured by exhaled carbon monoxide levels and levels of a nicotine metabolite in their blood.

As you might imagine, differences between self-reported and actual uses of Facebook could drastically change or even negate findings of how Facebook use is related to the aforementioned outcomes. My latest paper published in Computers in Human Behavior, "Comparing actual and self-reported measures of Facebook use," examines these differences.

Here is what I did: I paid students to allow me to install a monitor on their computers for one month. I also surveyed them about how much time they spent on Facebook and how many times they logged in to the site. In addition, I monitored and asked about other forms of technology and social media use (such as Twitter and email).

Here is what I found: As you can see in the scatterplot below, there was a significant positive correlation between self-reported and actual Facebook use (Pearson’s r = .587, p < .001).

[Figure: Scatterplot of the correlation between self-reported and actual Facebook use]

However, and here is the really interesting part, students significantly overestimated the amount of time they spent on Facebook. They reported spending an average of 149 minutes per day on Facebook, which was significantly higher than the 26 minutes per day they actually spent on the site (t(41) = 8.068, p < .001).
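For readers curious about the mechanics, these two analyses (a Pearson correlation across students, and a paired-samples t-test comparing each student's self-reported minutes to their monitored minutes) can be sketched as follows. The numbers below are invented for illustration only; they are not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical minutes per day for eight students (NOT the study's data):
# each pair is one student's self-reported time and monitored ("actual") time.
self_reported = np.array([120, 180, 90, 240, 150, 60, 200, 100])
actual = np.array([20, 35, 15, 50, 30, 10, 40, 18])

# Pearson's r: do students who report more time also actually log more?
r, r_pvalue = stats.pearsonr(self_reported, actual)

# Paired-samples t-test: is the mean self-reported time significantly
# different from the mean actual time for the same students?
t, t_pvalue = stats.ttest_rel(self_reported, actual)

print(f"Pearson's r = {r:.3f}, p = {r_pvalue:.4f}")
print(f"t({len(actual) - 1}) = {t:.3f}, p = {t_pvalue:.4f}")
```

A significant positive r together with a significant paired t is exactly the pattern reported here: rank order is preserved (heavier users report more use), but the overall level is inflated.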

What is going on? In the paper, I go into much more detail about why there is such a large and significant difference between actual and self-reported Facebook use, as well as why the two are significantly correlated. In brief:

  1. It could be that self-report questions aren’t specific enough to capture frequency of Facebook usage. Students may interpret a question asking “how much time do you spend on Facebook each day?” as meaning “how much time are you logged in?” or as a post-hoc focus group suggested “how much time do you spend thinking about Facebook?”
  2. Students may have implicit theories about how they use technology that are likely based on messages received from the media and from adults (i.e., “youth use Facebook a lot!”). This would lead them to give inflated estimates of the actual amount of time they spend on the site.
  3. Students may have accessed Facebook from multiple devices. This is certainly an important consideration, especially given the popularity of mobile Facebook use. However, I conducted analyses to try to explain the unaccounted-for self-reported time with mobile use and found that there was no way to explain the large difference between actual and self-reported time. Students could have used Facebook from other computers that were not being monitored; however, that is unlikely given their overall computer use. This is definitely an important facet to study in future research.
  4. Students (and people in general) might not be able to estimate Facebook use. This could very well be a function of how we’ve developed schemas about technology use. For instance, drivers often estimate driving distances in miles and time to destination; however, Internet users typically do not estimate frequency and intensity of use in time.


Of course, this study has limitations. Students who allowed their computer use to be monitored might have very different online behaviors from those who didn’t (although students who chose to install the monitoring software did not differ from those who didn’t on demographic variables or on all but one of the variables of interest). Another limitation is the possibility of observer effects: students may have behaved differently because they knew they were being monitored. Finally, students may have used other devices, such as their cell phones, to access Facebook, which would have led to a lower estimate of actual use.


It’s clear that self-reported measures of Facebook use can approximate, but are not accurate measures of, actual use. The inconsistency between self-reported and actual measurements will obscure how Facebook use is related to outcomes, signaling a problem for research relying on self-report measures. However, all is not lost: self-reported measures can give us a good approximation of frequency of use, and hopefully future extensions of this research will produce more accurate self-report measures or a “correction factor” for self-reported use.
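One simple form such a correction factor could take is an ordinary least-squares line that maps self-reported minutes onto the scale of actual minutes. This is only a sketch with invented data, not the regression from the paper:

```python
import numpy as np

# Hypothetical minutes per day (invented for illustration, not the paper's data).
self_reported = np.array([120, 180, 90, 240, 150, 60, 200, 100], dtype=float)
actual = np.array([20, 35, 15, 50, 30, 10, 40, 18], dtype=float)

# Fit actual ≈ slope * self_reported + intercept by least squares.
slope, intercept = np.polyfit(self_reported, actual, deg=1)

def corrected_estimate(reported_minutes: float) -> float:
    """Translate a survey answer onto the scale of monitored use."""
    return slope * reported_minutes + intercept

print(f"actual ≈ {slope:.2f} * reported + {intercept:.1f} minutes")
```

A slope well below 1 would reflect the overestimation found here; of course, such a correction is only useful if the overestimation pattern is stable across samples and question wordings.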


  • This is AWESOME!  Such great fundamental work that needed to be done.  Thank you.

  • Thanks Paul! I’ve been wanting to do this for a while, but only recently had the resources to do so (thanks in part to Doc Searls’ help finding the monitoring app!).

  • This is necessary work and I’m really glad that you found the resources to do it!  That you found that students overestimate their Fb usage is quite interesting and I hope there are details in your paper that describe precisely how you – using your monitoring tool – quantified use of Fb.

    Some of my colleagues and I are conducting a small study and we are asking students some relatively basic and broad background questions about their Internet use.  I think I’m pretty savvy in this area and even I was surprised how much we had to adjust those questions in response to the information we collected during cognitive interviews!  We came across two issues that seem to complicate this: background, passive use of the Internet (e.g., streaming music, being logged in but not actively using a website) and mobile usage which often occurs in small spurts making it difficult for users to accurately quantify amount and duration of use.  I’m sure those aren’t issues unique to University of Delaware students.

    So your next study is going to involve placing monitoring software on participants’ computers AND phones (AND tablets), right?  🙂  (I say that with a smile but it actually is a study that really needs to be done.  Weirdly intrusive monitoring apps for everyone!)

  • Kevin, I like how you haven’t read the paper yet you know what my next step is 🙂 That study is going to require a lot more resources! 

    In the paper, I discuss how students may interpret a general question to include both active and inactive time on the site. And of course, I go into detail about how the monitor quantifies Facebook use. 

  • Yeah, it sounds like we’re on the same page.  I look forward to reading the paper (instead of commenting on blogs while I procrastinate from some tedious data entry and analysis).  One of the specific adjustments we made to our survey questions was to add the word “actively” when asking about Internet use.  It seemed to clear up that bit of confusion to some degree.

  • Han Woo Park

    Great! I have been critical of traditional survey techniques. You might like to take a look at my presentation. 


  • Stan Dura

    Very thoughtful study, Rey! Given the reliability of their over-estimation, I wonder if students would be similarly reliable if asked about their time using FB on their computer as well as overall and how much closer it would be to actual time used. It might be a useful question while preparing to monitor other devices as well.

  • Stan Dura

    This may be a little “pie in the sky” but if it would be possible to work with FB (and other SM) to get data from the source, that would cover all potential devices.

  • I think the big difference between actual and reported use of Facebook arises because the research presumed that students only use their computers to access the site, as seen in the fact that actual usage was measured by installing monitoring software on their computers only. People, especially students, get onto Facebook even when they are in bed or relaxing outside, because they use their mobile phones; no matter where they are, even sitting on the toilet, one can access Facebook. So when students answered the questions, they likely based their answers on the overall time spent on Facebook, irrespective of the device used.
    My suggestion is that a similar study in the future should also ask which devices students use to access Facebook and how long they spend on each device. That way it will be easier to ascertain the actual time spent on Facebook and to measure the real difference.

  • Beth A. Sayre

    Not sure if you’ll even see my comment, but I am trying to use your study for a project in statistics. Is there any way you could tell me what your linear regression line was? I really want to use this study, as I find the results very interesting. I would sincerely appreciate it if you could tell me your linear regression line! Thank you!

  • Hi Beth – That information is in the paper available for download here: http://reyjunco.com/wordpress/pdf/JuncoFBActualvsSRCHB2013.pdf

Creative Commons License
Unless otherwise specified, all content on this blog is licensed under a
Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
