Mixed-Methods Study to Understand Public Use of Social Security's Online Platform
Social Security Bulletin, Vol. 83 No. 4, 2023
Since 2012, the Social Security Administration has offered online my Social Security accounts to provide a key informational resource to the public. Yet the number of my Social Security accountholders remains lower than the agency had hoped for. We conducted a mixed-methods study involving quantitative analysis of survey data and qualitative analysis of personal interviews to examine potential barriers to my Social Security access and to evaluate account users' experiences. The quantitative analysis shows that lower levels of internet literacy and educational attainment are barriers to accountholding and use. Our qualitative findings suggest that my Social Security can be useful in retirement planning, especially for younger adults, by filling knowledge gaps and correcting mistaken expectations. Further research can address ways to minimize or eliminate barriers to my Social Security access and use, and explore how to maximize its effectiveness in supporting retirement readiness and Social Security literacy.
The authors are with the University of Southern California Center for Economic and Social Research.
Acknowledgments: The research reported herein was derived from activities performed pursuant to a grant from the Social Security Administration (no. UM21-08, GR1055785) funded as part of the Retirement and Disability Research Consortium.
The findings and conclusions presented in the Bulletin are those of the authors and do not necessarily represent the views of the Social Security Administration.
Introduction
Selected Abbreviations
Abbreviation | Meaning |
---|---|
SSA | Social Security Administration |
UAS | Understanding America Study |
Knowledge about Social Security is critical to workers and their families. Well-informed individuals tend to make better financial decisions and prepare more effectively for retirement (Chan and Stevens 2008; Mastrobuoni 2011; Bhargava and Manoli 2015). Incomplete information about Social Security benefits and program rules may result in suboptimal decisions, such as claiming retirement benefits too early or too late to maximize likely lifetime benefit amounts.
The Social Security Administration (SSA) operates an extensive information outreach program. The Social Security Statement, containing general facts about the program along with individualized earnings histories and future benefit projections, has been the agency's primary channel for providing information to the public since its introduction in 1995 (Smith 2020). Research has shown that the Statement increases workers' knowledge about their Social Security benefits (Mastrobuoni 2011; Smith and Couch 2014; Sass 2015) and informs their planned claiming ages (Armour 2020) and Disability Insurance application decisions (Armour 2018).
In the late 1990s and the 2000s, SSA sent the Statement to all covered workers via annual mailings. Although budget constraints led the agency to scale back annual mailings beginning in 2012, SSA established my Social Security, an online portal for the public providing access to general and individualized program information. By signing up for a my Social Security account on the agency's website, users have a single point of access to many SSA electronic services and can obtain information about their own benefit entitlements—including their latest Social Security Statement, with their earnings history and personalized estimates of future benefits. Users are also able to conduct transactions online, such as requesting a replacement Social Security card, changing personal information, or applying for benefits (via a link) without calling or visiting a Social Security office.
This multipurpose platform therefore has two main advantages. First, it offers users significant time savings compared with seeking information or conducting transactions in person or by phone. Individuals can access my Social Security from anywhere that provides internet service. Second, my Social Security provides personalized information about key aspects of financial and retirement planning. This may be especially critical in the context of low Social Security and retirement-planning literacy among Americans. For instance, 63 percent of adult survey respondents feel that they are not knowledgeable about what their retirement benefits will be (Yoong, Rabinovich, and Wah 2015). Carman and Hung (2018) also document low Social Security literacy.
The increasing availability of personal devices with internet access expands the potential reach of online financial education resources that can quickly provide useful information. Lusardi and others (2017) examine innovative online financial education tools and observe that their potential effectiveness depends on ease of access and efficiency, with a low time commitment required of users. The my Social Security portal is a key source of information that is critical to financial well-being, and it meets those criteria. Yet the number of people who have opened a my Social Security account is lower than SSA had hoped for.
To our knowledge, no research has studied the low level of public engagement with my Social Security. To address this gap, we conducted a mixed-methods study, combining qualitative and quantitative data collection and analysis, to examine the perceived and actual barriers to use of my Social Security, along with the experiences reported by my Social Security accountholders. We hope the findings suggest ways to increase participation, enhance Social Security literacy, and enable the public to optimize their retirement planning and decisions.
General Approach
In phase 1 of our study (quantitative data collection), we used existing data from surveys administered through the University of Southern California's Understanding America Study (UAS), a probability panel of more than 8,000 respondents recruited using address-based sampling (Alattar, Messel, and Rogofsky 2018).1 The UAS panel is representative of the U.S. population aged 18 or older. After joining the panel, individuals are invited to take, on average, two surveys each month. Surveys are administered online (a tablet, broadband internet access, and training are provided for individuals who need them).2 Respondents are compensated $20 for a 30-minute survey (proportionally less for shorter surveys).
Two recurring surveys fielded to all UAS panelists every 2 years respectively measure respondents' Social Security literacy and their preferred sources of information about Social Security and retirement. These surveys include questions on awareness and use of my Social Security. Wave 3 of the Social Security program knowledge survey also measured respondents' internet literacy and the types and frequencies of their online activities. We used the quantitative data from these surveys for two distinct purposes: to analyze the determinants of my Social Security account usage and to recruit follow-up interviewees in a procedure described below.
For phase 2 (qualitative assessment of users' experience with the platform), we aimed to ensure that our interview subjects were diverse in terms of internet literacy, current usage of my Social Security, and Social Security beneficiary status. These variables were chosen because we anticipated that they would significantly affect individuals' perceptions of, awareness of, and experiences with my Social Security. Beneficiaries and nonbeneficiaries are likely to be interested in different aspects of my Social Security. For instance, nonbeneficiaries may profit from learning about their expected benefits, whereas beneficiaries may want to use the account to set up direct deposits or obtain a benefit verification letter, among other purposes.
We recruited 24 participants for phase 2 of our study. We chose 24 as our sample size because qualitative research literature suggests that data saturation—the point at which no new themes emerge from the data—is often achieved after as few as 10 to 20 interviews, depending on the type of population under investigation (Hennink, Kaiser, and Marconi 2017; Morgan and others 2002; Francis and others 2010; Guest, Bunce, and Johnson 2006; Namey and others 2016). We did not set out to understand how often these issues are found in the population, but rather the range and type of issues that may emerge in individuals' interactions with my Social Security. We sought adequate sample sizes both of platform users, whom we could ask about their experiences, and nonusers, whose reactions to first platform contact we could observe. Hence, we stratified respondents by accountholder status. To analyze whether internet literacy is an important determinant of my Social Security experiences, we also sought sufficient numbers of interviewees with levels of internet literacy both above and below the median.
To that end, we created an internet literacy index. The 2020 wave of the UAS Social Security program knowledge survey included a set of 35 questions designed to build a measure of internet literacy, which we adapted from the Internet Skills Scale developed by Van Deursen, Helsper, and Eynon (2016). For these questions, respondents reported their ability in a number of online tasks, such as downloading files, filling online forms, changing privacy settings, bookmarking a website, and downloading applications (“apps”) to a mobile device. Using a technique called principal component analysis (PCA), we created an internet literacy index comprising 35 weighted variables.3
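As an illustration, constructing a single index from many self-rated skill items with PCA can be sketched as follows. This is a minimal numpy-only sketch, not the authors' exact procedure; the function name, the use of only the first principal component, and the standardization choices are our assumptions:

```python
import numpy as np

def internet_literacy_index(responses):
    """Illustrative PCA-based index (a sketch, not the study's exact code).
    `responses` is an (n_respondents, n_items) array of self-rated ability
    scores. Returns each respondent's score on the first principal
    component, standardized to mean 0 and standard deviation 1."""
    X = np.asarray(responses, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each item
    # The right singular vectors of the standardized data matrix are the
    # principal axes; project the data onto the first axis.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[0]
    return (scores - scores.mean()) / scores.std()
```

Respondents scoring above the index median would then be classified as having high internet literacy for the stratification described below.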
To ensure adequate variation in the characteristics of interest in our sample, we used a stratified selection process. First, we divided the entire sample from the UAS Social Security literacy survey into eight groups, determined by the intersection of three binary variables: above or below the median level of internet literacy; my Social Security accountholder status; and Social Security beneficiary status. Then, we randomly chose three people from each group for interviews. The invitation to participate included a screening question eliciting participants' willingness to log into or open their my Social Security account online during the interview, a requirement for the qualitative assessment of users' experiences with the platform.
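The stratified draw described above can be sketched in a few lines. The field names and data layout below are illustrative assumptions, not the UAS's actual variables:

```python
import random

def stratified_invitees(panel, per_cell=3, seed=42):
    """Randomly draw `per_cell` people from each of the 8 strata defined
    by the intersection of three binary traits (illustrative sketch).
    `panel` is a list of dicts with keys 'id', 'high_literacy',
    'has_account', and 'beneficiary'."""
    rng = random.Random(seed)
    cells = {}
    for person in panel:
        key = (person["high_literacy"],
               person["has_account"],
               person["beneficiary"])
        cells.setdefault(key, []).append(person)
    invitees = []
    for key in sorted(cells):  # 2 x 2 x 2 = 8 strata
        invitees.extend(rng.sample(cells[key], per_cell))
    return invitees
```

With three draws from each of the eight strata, this yields the 24-person interview sample.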
All interviews were conducted by phone during May and June 2021, with participants asked to have a laptop or tablet and internet access ready. The interview consisted of two segments. First, participants were asked about their prior interactions with SSA, their online habits, and, for my Social Security accountholders, their experience opening the account. In the second segment, accountholders were asked to log into their account and answer a series of questions about their experience as they navigated various elements within the platform. By contrast, respondents without an account were asked to create one during the interview, then asked to answer a similar series of questions about their impressions of the platform.
All interviews were recorded and transcribed verbatim. We employed thematic analysis of the transcripts, a technique that focuses on description and interpretation of narrative materials (Braun and Clarke 2006; Thomas 2006). That process began with the development of a preliminary codebook, based on the research questions with which we coded the raw interview data. New codes were developed inductively; that is, as themes emerged through review of the data. Ultimately, we generated 41 individual codes, which allowed us to identify major manifest and latent themes, key concepts, areas of divergence, and connections between messages inherent in the raw data.
The study approach received ethics approval by the University of Southern California's Institutional Review Board. Participants in the qualitative interviews, who provided informed consent both at the time of recruitment and at the start of the interview, were compensated $40 for participating.
Quantitative Study
We had two main goals for the quantitative part of the study. The first goal was to identify and measure the factors affecting my Social Security use and the usage patterns in the most recent years. Analyzing the correlates of my Social Security usage could shed light on barriers to expansion of the platform's reach. The second goal was to gather information on current Social Security beneficiary status, internet literacy, and my Social Security use, which we could use to ensure sufficient diversity among participants selected for our qualitative study.
Data
From the UAS, we used data mainly from the first three rounds of two longitudinal surveys, formally titled What do People Know about Social Security4 and Retirement Planning,5 with the latter survey focusing on how respondents “get and/or would prefer to receive information on retirement planning from [SSA] and other sources” (Rabinovich, Perez-Arce, and Yoong 2022). Hereafter, we refer to these as the What People Know and Information Channels surveys, respectively. Because UAS panel membership increased during the study period, every follow-up wave included both respondents who had answered prior survey rounds and new panelists who were participating for the first time.
The first three rounds of the What People Know survey were conducted during the period 2015–2021. This survey, covering respondent knowledge of Social Security programs and about retirement in general, includes questions about intended retirement and benefit-claiming ages. The third round also included a battery of questions intended to measure internet literacy and use.
The first three rounds of the Information Channels survey were also conducted during 2015–2021. It covers preferred means of receiving information and contacting SSA field offices (internet, regular mail, phone, or in-person visits); receipt of the Social Security Statement; and my Social Security accountholder status.
Other UAS surveys (including modules containing questions from the University of Michigan's Health and Retirement Study) collect information on a broad range of related topics such as retirement income from Social Security benefits and other sources. The UAS Comprehensive File compiles the data from the Social Security and related surveys. We used data from the June 2021 update of the Comprehensive File.6
Outcome Variables
To gauge the extent of my Social Security awareness, accountholding, and use, we looked at responses to three specific questions in the Information Channels survey:
- Have you previously heard about my Social Security?
- Have you set up a my Social Security account?
- Have you ever used my Social Security to do any of the following? Please select all that apply:
- Track and verify your earnings;
- Get a replacement Social Security card;
- Get an estimate of future benefits;
- Get a letter with proof of benefits;
- Change your personal information such as address;
- Start or change your direct deposit;
- Get a replacement Medicare card;
- Get a replacement SSA-1099 or SSA-1042S;
- None of the above.
We constructed awareness and accountholding variables, respectively, as indicators of affirmative responses to the first and second questions above. To proxy for the extent of account use, we constructed the frequency of use variable by counting how many of the activities were selected in the response to the third question. Frequency of use is coded zero if the respondent does not have an account. To construct these variables, we restricted our analysis to results of the third wave of the Information Channels survey.
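The coding rules just described can be made concrete in a short sketch. The activity strings mirror the survey options listed above; the function name and signature are ours:

```python
def code_outcomes(heard_of_platform, has_account, activities):
    """Code the three outcome variables from the survey responses
    (illustrative sketch of the rules described in the text).
    `activities` is the list of options selected for the third question;
    frequency of use is coded zero for respondents without an account."""
    awareness = int(heard_of_platform)
    accountholding = int(has_account)
    used = [a for a in activities if a != "None of the above"]
    frequency = len(used) if has_account else 0
    return awareness, accountholding, frequency
```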
Predictor Variables
We explored the extent to which demographic variables such as age, sex, race, ethnicity, and education may determine my Social Security awareness, accountholding, and use. To measure education, we used either a dummy variable indicating that the respondent attended college, or a variable measuring number of years of education. We also used a beneficiary variable to indicate whether the respondent currently receives or recently received Social Security benefits. To identify the extent to which limited internet literacy inhibits my Social Security access and use, we used the internet literacy index described in the preceding section.
Quantitative Results
Using wave 3 of the Information Channels survey (UAS 238, fielded in April 2020), we found that 81 percent of U.S. adults do not have an account and have never used my Social Security, while 19 percent have used it at least once (not shown). Among account users, 44 percent have conducted only one activity on the platform, 32 percent have conducted two activities, and 24 percent have conducted three or more activities (Chart 1).
Chart 1. Percentage distribution of my Social Security account users, by number of activities conducted on the platform
Number of activities | Percent |
---|---|
1 | 44 |
2 | 32 |
3 | 15 |
4 | 6 |
5 or more | 3 |
Table 1 shows the unweighted distributions of our sample respondents by age group, sex, race and ethnicity, Social Security beneficiary status, educational attainment, and my Social Security accountholder status. The age groups are fairly evenly represented, as are individuals with and without a college degree; women are overrepresented in this sample.
Table 1. Percentage distribution of sample respondents, by selected characteristics (unweighted)
Characteristic | Percent |
---|---|
Age | |
18–29 | 19 |
30–39 | 13 |
40–49 | 20 |
50–59 | 18 |
60–69 | 17 |
70 or older | 12 |
Sex | |
Women | 59 |
Men | 41 |
Race and ethnicity | |
Hispanic (any race) | 18 |
Non-Hispanic— | |
White | 63 |
Black | 8 |
Other race a | 11 |
Social Security beneficiary | |
Yes | 26 |
No | 74 |
Education | |
Has bachelor's degree | 40 |
No bachelor's degree | 60 |
my Social Security accountholder | |
Yes | 19 |
No | 81 |
Sample size | 3,913 |
SOURCE: Authors' calculations based on UAS238. | |
NOTE: Rounded components of percentage distributions do not necessarily sum to 100. | |
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
Determinants of Awareness and Usage
To identify factors that may explain my Social Security awareness and usage, we used regression models that account for general demographic characteristics and include variables that may indicate barriers to platform use, such as limited internet literacy. We began by using probit models and results from the most recent wave of the two surveys to estimate equation 1, where Yi represents the dependent variable (my Social Security awareness or accountholding), Xi represents the vector of independent variables for individual i (which include age, sex, education, internet literacy and usage, and beneficiary status), and εi is the error term:

Yi = βXi + εi (1)
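For illustration, estimating such a binary-outcome index model can be sketched in self-contained numpy. To keep the example dependency-free we use a logit link rather than the probit link estimated in the article; the two differ only in the assumed error distribution and typically yield very similar qualitative results:

```python
import numpy as np

def fit_binary_model(X, y, iters=2000, lr=0.5):
    """Maximum-likelihood fit of Pr(Y_i = 1 | X_i) by gradient ascent.
    A logit link is used here for simplicity (the article uses probits).
    X: (n, k) regressor matrix including a constant column; y: (n,) 0/1."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))   # predicted probabilities
        beta += lr * X.T @ (y - p) / len(y)     # log-likelihood gradient
    return beta
```

In practice one would use a standard statistical package and report average marginal effects, since the article interprets the coefficients in Tables 2 through 5 in percentage-point terms.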
Table 2 shows the regression estimates from two probit models for determinants of my Social Security awareness. People with higher levels of internet literacy are 2.8 percentage points more likely than others to be aware of the platform's existence, and beneficiaries are 6.6–7.5 percentage points more likely to be aware of my Social Security than nonbeneficiaries. Even though the regression models account for beneficiary status, age is still a significant determinant of awareness: From the sample mean, an additional year of age is associated with a 0.8 or 0.9 percentage-point increase in awareness. Neither sex nor household income is a significant predictor of awareness, but personal earnings level is, suggesting that higher earners are also more likely to be aware of my Social Security.
Table 2. Determinants of my Social Security awareness: Probit estimates
Independent variable | Model 1 (incorporating five independent variables) | Model 2 (incorporating ten independent variables) | ||
---|---|---|---|---|
Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||
Internet literacy | 0.028*** | 0.003 | 0.028*** | 0.003 |
Social Security beneficiary | 0.066*** | 0.023 | 0.075*** | 0.024 |
Women | -0.010 | 0.016 | -0.006 | 0.017 |
Race and ethnicity | ||||
Hispanic (any race) | . . . | . . . | 0.045 | 0.028 |
Non-Hispanic Black | . . . | . . . | 0.066** | 0.033 |
Other race (non-Hispanic) a | . . . | . . . | 0.033 | 0.030 |
Incremental variables | ||||
Age | 0.008*** | 0.001 | 0.009*** | 0.001 |
Years of education | 0.000 | 0.004 | 0.000 | 0.004 |
Earnings | . . . | . . . | 0.031* | 0.016 |
Household income | . . . | . . . | 0.000 | 0.005 |
Observations | 3,915 | 3,901 | ||
Pseudo R2 | 0.055 | 0.057 | ||
SOURCE: Authors' calculations based on UAS238. | ||||
NOTES: Omitted reference categories for binary variables are low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for age and years of education and $10,000 for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
Table 3 repeats the analysis of Table 2 for my Social Security accountholding. It shows that internet literacy and beneficiary status are statistically significant predictors of having an account. The coefficient for internet literacy implies that a respondent whose proficiency is one standard deviation above the mean is 2.7 percentage points more likely to have an account. Likewise, the probability of a Social Security beneficiary having an account is 4.5 percentage points higher than that of a nonbeneficiary. Age and education are also significant predictors, with an additional year of age and an additional year of education each predicting an increase of about 1 percentage point in the probability of having an account.
Table 3. Determinants of my Social Security accountholding: Probit estimates
Independent variable | Model 1 (incorporating five independent variables) | Model 2 (incorporating ten independent variables) | ||
---|---|---|---|---|
Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||
Internet literacy | 0.027*** | 0.000 | 0.027*** | 0.000 |
Social Security beneficiary | 0.045** | 0.020 | 0.047** | 0.020 |
Women | 0.003 | 0.010 | 0.004 | 0.020 |
Race and ethnicity | ||||
Hispanic (any race) | . . . | . . . | 0.013 | 0.030 |
Non-Hispanic Black | . . . | . . . | 0.014 | 0.030 |
Other race (non-Hispanic) a | . . . | . . . | 0.054* | 0.030 |
Incremental variables | ||||
Age | 0.009*** | 0.000 | 0.009*** | 0.000 |
Years of education | 0.010*** | 0.000 | 0.010*** | 0.000 |
Earnings | . . . | . . . | 0.009 | 0.010 |
Household income | . . . | . . . | -0.001 | 0.000 |
Observations | 3,913 | 3,899 | ||
Pseudo R2 | 0.087 | 0.088 | ||
SOURCE: Authors' calculations based on UAS238. | ||||
NOTES: Omitted reference categories for binary variables are low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for age and years of education and $10,000 for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
In Table 4, we present results of similar probit models analyzing the determinants of my Social Security awareness and accountholding, but with the age variable expanded to comprise a series of age-group dummies. The coefficients suggest a roughly linear increase in both awareness and accountholding as age increases. In addition to the probit models, we used linear probability models to generate estimates (not shown) that are qualitatively similar to those shown in Tables 2–4.
Table 4. Determinants of my Social Security awareness and accountholding, with age-group dummy variables: Probit estimates
Independent variable | Awareness | Accountholding | ||||||
---|---|---|---|---|---|---|---|---|
Model 1 (incorporating eight independent variables) |
Model 2 (incorporating 13 independent variables) |
Model 3 (incorporating eight independent variables) |
Model 4 (incorporating 13 independent variables) |
|||||
Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||||||
Age | ||||||||
30–39 | 0.125*** | 0.043 | 0.127*** | 0.043 | 0.181*** | 0.054 | 0.185*** | 0.055 |
40–49 | 0.209*** | 0.041 | 0.212*** | 0.042 | 0.280*** | 0.054 | 0.289*** | 0.055 |
50–59 | 0.321*** | 0.039 | 0.325*** | 0.040 | 0.415*** | 0.051 | 0.422*** | 0.052 |
60 or older | 0.441*** | 0.037 | 0.454*** | 0.038 | 0.537*** | 0.044 | 0.548*** | 0.044 |
Internet literacy | 0.029*** | 0.003 | 0.029*** | 0.003 | 0.027*** | 0.002 | 0.027*** | 0.002 |
Social Security beneficiary | 0.072*** | 0.023 | 0.082*** | 0.024 | 0.057*** | 0.020 | 0.059*** | 0.021 |
Women | -0.009 | 0.016 | -0.006 | 0.017 | 0.002 | 0.014 | 0.003 | 0.014 |
Race and ethnicity | ||||||||
Hispanic (any race) | . . . | . . . | 0.055** | 0.028 | . . . | . . . | 0.024 | 0.026 |
Non-Hispanic Black | . . . | . . . | 0.069** | 0.033 | . . . | . . . | 0.014 | 0.030 |
Other race (non-Hispanic) a | . . . | . . . | 0.038 | 0.030 | . . . | . . . | 0.060** | 0.028 |
Incremental variables | ||||||||
Years of education | 0.001 | 0.004 | 0.001 | 0.004 | 0.011*** | 0.003 | 0.011*** | 0.004 |
Earnings | . . . | . . . | 0.029* | 0.017 | . . . | . . . | 0.006 | 0.010 |
Household income | . . . | . . . | -0.001 | 0.005 | . . . | . . . | -0.002 | 0.004 |
Observations | 3,915 | 3,901 | 3,913 | 3,899 | ||||
Pseudo R2 | 0.068 | 0.071 | 0.107 | 0.109 | ||||
SOURCE: Authors' calculations based on UAS238. | ||||||||
NOTES: Omitted reference categories for binary variables are ages 18–29, low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for years of education and one standard deviation for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
One potential concern of using data from panel surveys is that respondents' knowledge and behavior may be affected by their participation in earlier similar surveys—a phenomenon called panel conditioning. Having answered a given question in one or two prior survey rounds could, in principle, affect the response in a current round (although to affect the results of Tables 2 and 3, panel conditioning would have to affect responses differently across the characteristics of interest in our analysis). To assess the extent to which this may have occurred, we reused the probit models of Tables 2 and 3 and calculated separate estimates for wave 3's new and repeat respondents. We found that the results are qualitatively similar, with age, education, internet literacy, and beneficiary status being important predictors of the outcome variables in both subsamples. Table 5 shows the results.
Table 5. Determinants of my Social Security awareness and accountholding, for repeat and first-time respondents: Probit estimates
Independent variable | Awareness | Accountholding | ||||||
---|---|---|---|---|---|---|---|---|
Model 1 (repeat respondents) |
Model 2 (first-time respondents) |
Model 3 (repeat respondents) |
Model 4 (first-time respondents) |
|||||
Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||||||
Internet literacy | 0.025*** | 0.003 | 0.015*** | 0.005 | 0.022*** | 0.002 | 0.026*** | 0.006 |
Social Security beneficiary | 0.071*** | 0.027 | 0.040 | 0.045 | 0.054** | 0.022 | 0.029 | 0.048 |
Women | -0.006 | 0.018 | 0.016 | 0.033 | 0.008 | 0.015 | 0.013 | 0.036 |
Race and ethnicity | ||||||||
Hispanic (any race) | 0.027 | 0.030 | 0.022 | 0.053 | -0.008 | 0.026 | 0.033 | 0.061 |
Non-Hispanic Black | 0.086** | 0.036 | -0.044 | 0.069 | 0.023 | 0.030 | -0.044 | 0.074 |
Other race (non-Hispanic) a | 0.016 | 0.032 | 0.035 | 0.055 | 0.026 | 0.028 | 0.099 | 0.062 |
Incremental variables | ||||||||
Age | 0.008*** | 0.001 | 0.005*** | 0.002 | 0.008*** | 0.001 | 0.009*** | 0.002 |
Years of education | 0.001 | 0.004 | 0.000 | 0.009 | 0.005 | 0.004 | 0.024** | 0.009 |
Earnings | 0.014 | 0.018 | 0.086** | 0.037 | 0.022 | 0.015 | 0.004 | 0.021 |
Household income | 0.004 | 0.005 | -0.026** | 0.013 | 0.001 | 0.004 | -0.009 | 0.014 |
Observations | 3,077 | 824 | 3,077 | 822 | ||||
Pseudo R2 | 0.0598 | 0.0262 | 0.0956 | 0.0517 | ||||
SOURCE: Authors' calculations based on UAS238. | ||||||||
NOTES: Omitted reference categories for binary variables are low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for age and years of education and $10,000 for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
To measure the frequency of account activity, we used linear models having the same independent variables as the probit models, with the dependent variable being the number of activities conducted.7 The results (Table 6) are shown separately for all respondents—including those without a my Social Security account—and for accountholders only. The latter estimates are useful not only for understanding the factors that affect frequency of use, but also for assessing whether those factors differ from the ones that affect opening an account.
Table 6. Determinants of frequency of my Social Security account use: Linear regression estimates
Independent variable | All respondents | Accountholders only | ||||||
---|---|---|---|---|---|---|---|---|
Model 1 (incorporating five independent variables) |
Model 2 (incorporating ten independent variables) |
Model 3 (incorporating five independent variables) |
Model 4 (incorporating ten independent variables) |
|||||
Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||||||
Internet literacy | 0.057*** | 0.005 | 0.058*** | 0.005 | 0.036*** | 0.011 | 0.036*** | 0.011 |
Social Security beneficiary | 0.244*** | 0.044 | 0.243*** | 0.045 | 0.352*** | 0.078 | 0.329*** | 0.082 |
Women | -0.025 | 0.031 | -0.025 | 0.032 | -0.099 | 0.065 | -0.112* | 0.066 |
Race and ethnicity | ||||||||
Hispanic (any race) | . . . | . . . | 0.047 | 0.052 | . . . | . . . | 0.080 | 0.122 |
Non-Hispanic Black | . . . | . . . | 0.133** | 0.061 | . . . | . . . | 0.347*** | 0.131 |
Other race (non-Hispanic) a | . . . | . . . | 0.189*** | 0.056 | . . . | . . . | 0.321*** | 0.112 |
Incremental variables | ||||||||
Age | 0.016*** | 0.001 | 0.017*** | 0.001 | 0.005* | 0.003 | 0.007** | 0.003 |
Years of education | 0.018** | 0.007 | 0.019** | 0.008 | 0.005 | 0.016 | 0.006 | 0.017 |
Earnings | . . . | . . . | 0.008 | 0.024 | . . . | . . . | -0.041 | 0.066 |
Household income | . . . | . . . | -0.002 | 0.010 | . . . | . . . | 0.005 | 0.018 |
Constant | -0.563*** | 0.117 | -0.632*** | 0.121 | 1.548*** | 0.274 | 1.433*** | 0.282 |
Observations | 3,913 | 3,899 | 1,051 | 1,047 | ||||
Pseudo R2 | 0.092 | 0.096 | 0.041 | 0.055 | ||||
SOURCE: Authors' calculations based on UAS238. | ||||||||
NOTES: Omitted reference categories for binary variables are low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for age and years of education and $10,000 for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Includes Asian, Native American, Pacific Islander, and multiracial. |
Overall, we found that the strongest predictor of more frequent account activity is being a Social Security beneficiary, as it was for platform awareness and accountholding. On average, beneficiaries conduct 0.24 more activities than nonbeneficiaries overall (and, conditional on having an account, they conduct 0.33–0.35 additional activities). Higher internet literacy and educational levels are also important determinants of increased account use. Younger individuals are likely to use my Social Security less frequently than older ones, even when controlling for beneficiary status (not shown).
Trajectories of Platform Awareness and Account Usage
Our use of longitudinal data allows us to study trends in my Social Security awareness, accountholding, and frequency of use. Awareness has increased substantially: From 2015 to 2018, the proportion of respondents who had heard about the platform rose steadily from 21 percent to 34 percent; since then, the proportion has hovered between 29 percent and 37 percent (Chart 2, Panel A).
Year | Panel A: Have heard of my Social Security (%) | Panel B: Have an account (%) | Panel B: Frequency of use (average number of activities initiated)
---|---|---|---
2015 | 21 | . . . | . . . |
2016 | 25 | . . . | . . . |
2017 | 29 | 20.7 | 0.39 |
2018 | 34 | 21.0 | 0.38 |
2019 | 29 | 16.8 | 0.30 |
2020 | 37 | 24.5 | 0.46 |
2021 | 33 | 23.1 | 0.42 |
NOTE: . . . = not applicable. |
It is important to reiterate that the samples for later years include new respondents, as the panel grows. Hence the respondent population is not identical across years (although UAS uses weights to maintain the sample's representativeness of the adult U.S. population each year). Although panel conditioning could have been a factor in rising awareness over time, Table 5 showed no significant differences between the new and repeat respondents.
Because the first round of the survey did not include the questions we used to code accountholding and usage, we can track those variables only since 2017. Nevertheless, a slight upward trend emerges. In the first two years the survey included the questions on accountholding and use (2017 and 2018), about 21 percent of respondents had an account (Chart 2, Panel B). In 2020 and 2021, about 24 percent of respondents had an account. For all respondents, including those without an account, the average number of activities initiated on the platform increased from 0.39 in 2017 and 0.38 in 2018 to 0.46 in 2020 and 0.42 in 2021.
Determinants of Changes in Account Usage
We studied the determinants of change in my Social Security awareness, accountholding, and use by comparing the UAS results from the earliest available and most recent survey waves. Hence, for the awareness variable, we examined changes between the first wave (2015–2016) and third wave (2020–2021) of the surveys, while for the accountholding and frequency of use variables, we examined changes between the second (2017–2019) and third waves.
We used regression models in which the dependent variable is the change in each outcome variable over the study period. In equation 2, y_i1 and y_i0 denote the values of the outcome variable in the final and initial period, respectively, X_i includes the independent variables (age, race, ethnicity, education, internet literacy, internet use, and beneficiary status) measured in the baseline wave, and ε_i represents random error:

y_i1 − y_i0 = βX_i + ε_i. (2)

We used ordered probit models. For awareness and accountholding, the dependent variable (y_i1 − y_i0) can take on three values: −1, if the respondent was aware of or had an account in the earlier survey wave and was not aware of or did not have an account in the third wave; 0, if there was no change in status during the period; and 1, if the respondent was newly aware of or had first opened an account as of the third survey wave.
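As an illustration, the three-valued transition outcome described above can be constructed as follows. This is a minimal sketch using hypothetical data; the function and variable names are ours and are not part of the UAS data processing:

```python
import numpy as np

def code_transition(status_t0, status_t1):
    """Code the change in a binary outcome (awareness or accountholding)
    between two survey waves:
      -1 = had the status in the earlier wave but not the later wave,
       0 = no change in status between waves,
       1 = newly aware, or newly opened an account, by the later wave."""
    return (np.asarray(status_t1, dtype=int)
            - np.asarray(status_t0, dtype=int))

# Hypothetical accountholding indicators for four respondents
wave2 = [1, 0, 0, 1]  # second wave (2017-2019)
wave3 = [0, 0, 1, 1]  # third wave (2020-2021)
print(code_transition(wave2, wave3).tolist())  # → [-1, 0, 1, 0]
```

The resulting ordinal variable (−1, 0, 1) then serves as the dependent variable in ordered probit models such as those reported in Table 7.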
Table 7 shows the determinants of transitions in awareness, opening an account, and frequency of account use. For all outcomes, the coefficients for internet literacy are indistinguishable from zero. By contrast, the coefficient for beneficiary status is significantly below zero for all three outcome variables, showing that the increases have been greater among nonbeneficiaries than among beneficiaries. This may be seen as a positive sign that reach is increasing among the nonbeneficiary population.
Independent variable | Awareness a | Accountholding b | Frequency of use b | |||
---|---|---|---|---|---|---|
Coefficient | Standard error | Coefficient | Standard error | Coefficient | Standard error | |
Binary variables | ||||||
Internet literacy | 0.004 | 0.006 | 0.006 | 0.007 | 0.009 | 0.006 |
Social Security beneficiary | -0.261*** | 0.059 | -0.295*** | 0.065 | -0.289*** | 0.056 |
Women | 0.015 | 0.041 | 0.108** | 0.046 | 0.077* | 0.040 |
Race and ethnicity | ||||||
Hispanic (any race) | 0.065 | 0.068 | 0.049 | 0.076 | 0.001 | 0.066 |
Non-Hispanic Black | 0.012 | 0.080 | -0.009 | 0.089 | 0.142* | 0.077 |
Other race (non-Hispanic) c | 0.034 | 0.074 | 0.257*** | 0.081 | 0.223*** | 0.071 |
Incremental variables | ||||||
Age | 0.004** | 0.002 | 0.008*** | 0.002 | 0.009*** | 0.002 |
Years of education | 0.008 | 0.010 | 0.000 | 0.011 | 0.002 | 0.010 |
Earnings | 0.008 | 0.031 | 0.008 | 0.034 | -0.016 | 0.030 |
Household income | 0.005 | 0.012 | 0.000 | 0.014 | 0.008 | 0.012 |
Observations | 3,846 | 3,842 | 3,842 | |||
Pseudo R2 | 0.005 | 0.010 | 0.007 | |||
SOURCE: Authors' calculations based on various UAS surveys. | ||||||
NOTES: Omitted reference categories for binary variables are low internet literacy, Social Security nonbeneficiary, men, and non-Hispanic White, as applicable.
Intervals for incremental variables are 1 year for age and years of education and $10,000 for earnings and household income.
. . . = not applicable.
* = statistically significant at the p < 0.10 level; ** = statistically significant at the p < 0.05 level; *** = statistically significant at the p < 0.01 level.
a. Differences between survey wave 1 (2015–2016) and wave 3 (2020–2021). | ||||||
b. Differences between survey wave 2 (2017–2019) and wave 3 (2020–2021). | ||||||
c. Includes Asian, Native American, Pacific Islander, and multiracial. |
Accountholding and frequency of use have increased among women. Age is likewise positively related to increases in both awareness and use, suggesting that the growth has been greater among older respondents. These results are based on only a few years, and clearer patterns may emerge over a longer observation period.
Qualitative Study
Table 8 presents summary characteristics of our qualitative study sample. As intended, our sample was evenly split between individuals with and without a preexisting my Social Security account, and between those with internet literacy below and above the median. Although a majority (15) of sample members were Social Security beneficiaries, seven of them were not accountholders prior to the interview (not shown).
Characteristic | Number |
---|---|
Total | 24 |
Sex | |
Women | 11 |
Men | 13 |
Preinterview accountholder | |
Yes | 12 |
No | 12 |
Internet literacy | |
High | 12 |
Low | 12 |
Social Security beneficiary status | |
Retirement benefits | 9 |
Disability program benefits a | 5 |
Other benefits | 1 |
Nonbeneficiary | 9 |
Educational attainment | |
High school diploma or equivalent | 5 |
Some college | 10 |
Bachelor's degree or higher | 9 |
Age (years) | |
Average | 59 |
Youngest | 27 |
Oldest | 81 |
SOURCE: Authors' calculations using UAS231. | |
a. Disability Insurance or Supplemental Security Income. |
Results
Our interviews sought participants' views and experiences with online transactions generally and with my Social Security specifically, and participants' perceptions of the my Social Security platform as they navigated it in real time during the interview. We provide selected direct quotations to convey participants' views and reactions in their own words.
Overall Attitudes Toward Online Transactions
At the start of the interview, we asked participants to tell us about their typical online habits, to help us frame their views of my Social Security in the context of their overall internet activities. Note that all interviewees use the internet to participate in the UAS panel from which they were recruited. This likely introduces some selection bias to our sample, as we had no participants with little or no exposure to and use of the internet. However, even among UAS panel members, we found diverse views on internet usage, including a refusal by some individuals to use the internet for potentially sensitive transactions such as shopping or banking.
Most of our participants reported at least some internet usage beyond UAS participation. Both users and nonusers of online services expressed the importance of privacy and security and acknowledged that conducting online transactions requiring personal information such as bank account or Social Security numbers comes with risks. Nevertheless, more active users of online services accept those risks as inevitable:
[Security and privacy] concern me, but I think it's also in our current environment of working and trying to do some of these things that we have to do. So, I think there's a compromise. Yes, I'm concerned about the level of security, but at the same time I think it's a necessary thing.
Those who did not use the internet for shopping and banking cited two main reasons: first, security and privacy concerns; and second, low internet or computer literacy. One interviewee reported:
I'm old-fashioned. I still believe in keeping cash in my pocket. I don't trust the credit cards. [I only use computers to] look at Facebook. Communicate with my family… That's about it really. I'm really not too good on them.
In addition to privacy concerns and low internet or computer literacy, participants without a my Social Security account prior to the interview cited two additional reasons for not having engaged with the platform before. First, some participants had not known it existed:
When I got married, I had to change my name legally on my [Social Security] card—but I physically went to the office. I did not know that online was an option. If I knew, I would have done it.
Second, some individuals did not consider it necessary to create an account because they had no need for the information and services available on the platform—although some recognized that they might in the future:
I'm getting closer to the age, so I kind of want to see where my benefits are as it gets closer. I want to see…how much longer I have to work. Knowing myself, I won't do it until I'm like 55 and 10 years from retirement and see what I need to do to make things better.
Prior Interactions with SSA
Participants who had a my Social Security account prior to the interview reported creating the account under three broad circumstances: (1) when filing for disability, retirement, or survivor benefits; (2) when seeking information to prepare to file for benefits; and (3) when requesting new or replacement documents. Some interviewees created the account for a specific purpose (for example, to obtain a replacement card) and have used the account infrequently or not at all since then. Others reported using it more regularly (for example, every year), to check benefit amounts, payment dates, or the accuracy of their recorded earnings history:
We used to get paperwork where Social Security would send you letters about how much money you could expect and how much money you had made before we retired. So that's when I had called and made an [online] account and left it at that. [Since then] I haven't used my account online, for probably nothing really.
Participants who did not have an account before the interview tended to have relatively few prior interactions with SSA, even though some of them were beneficiaries. These participants either had not yet needed to interact with SSA or had done so only once or twice, in person or by phone. Some of these participants said that they had never heard of my Social Security and did not consider it to be relevant or useful to them yet.
Logging Into or Creating a my Social Security Account
Following the broader discussion about online habits and prior interactions with SSA, we asked participants to log into their my Social Security account or create one if they did not already have one.
For some participants, creating the account or logging into an existing account was straightforward and quick, while for others (including some accountholders), the process was more fraught. Some found the validation process (receiving a security code by text message or email) confusing; some no longer had access to registered email accounts; others were confused by complex identification requirements or other issues. Three participants were unable to log into their accounts, and another decided not to proceed with creating an account during the interview.
Those who created or logged into my Social Security noted that even if the process was easy for them, it may be too complex for others. They commented that some degree of comfort with computers may be a prerequisite for successful signups, especially for people who may have trouble obtaining the security code for access to the platform, which involves checking email or receiving a code by phone and thus may require using two devices.
Platform Layout
Among the 20 participants who were able to access their accounts, the majority reported satisfaction with the layout and visual aspects of the site. Overall, participants expressed a preference for the platform's lean and simple style over more “bells and whistles” like those they might find on commercial websites. Nevertheless, participants noted that the platform would benefit from better signposting for certain important features, such as Medicare-related information and the sliding scale for the retirement estimator, which went unnoticed by several participants until they were prompted.
During the interview, participants were also asked to find specific items of information on the site, such as benefit eligibility information or how to request a verification letter or replacement documents. When asked to rate the ease of finding the information, most rated it 1 or 2 on a scale of 1 to 5, with 1 being very easy and 5 being not at all easy. Participants typically needed well under a minute to find the various items of information on the platform.
One notable exception, however, was Medicare information, which several people had trouble finding on the platform. In fact, when prompted, a number of participants said that my Social Security (or SSA more generally) would not be where they would have thought they could find Medicare information or conduct Medicare-related transactions in the first place.
This just says Social Security. I've never seen anything in here that talks about Medicare. That's a different department.
I most likely would not go to a Social Security site to look for a Medicare card replacement. That wouldn't be the first—I wouldn't even go to that site. I'd probably Google it first.
Clarity of the Information
Participants said that most of the functions they sought on the platform, such as finding basic benefit eligibility information, application links, and how to replace documents, were straightforward and clear. Nevertheless, some of our preretirement participants were dissatisfied with the information available on two particular topics: (1) the interaction between benefits and pensions, and (2) the interaction between retirement benefits and spousal/survivor benefits. For this type of information, participants wanted a clear way to estimate optimal claiming behavior, which they did not feel the platform afforded them:
I'm going to have to look around here. They explained to me that it might not even benefit me [to claim retirement benefits] when I am 62, that if I'm going to get more money from myself or it's just going to be about the same as me getting it from survivor benefits. So that's exactly what I'm interested in now…I actually don't see where it just says that here.
Usefulness/Relevance of the Information
Finally, participants found the information on the platform to be relevant to their circumstances. This was true for current beneficiaries and nonbeneficiaries alike, as well as for those with and without an account before the interview. Nonretirees particularly appreciated the retirement benefit information, some of which was a surprise to them. One participant, for instance, did not realize that his full retirement age was 67 (he had assumed it was 65). Another realized he qualified for Social Security benefits only while checking his account during the interview:
It's pretty cool, because I remember the last time I checked, I didn't have enough quarters [of coverage]. Nothing has changed that I've been aware of, because it wasn't like I worked a year and forgot about it, and then they added that information in. And, I mean, it's right there. There's no hard search, and it's written in a simple way that I think most people will be able to understand if they're trying to. Now that I have this, I will look into it further to find out exactly what's going on.
A few said that the amounts indicated in the platform's benefits estimator were lower than they had expected:
You can see the difference in the benefit amount, as you start thinking about if you want to retire earlier in life or later in life. It'll definitely make me think about my financial situation…. Putting some money aside in some sort of a retirement account… If I were to retire at full retirement age at 67, it gives me my benefit amount, and that is not quite nearly enough to survive on. It's significantly less than the rent or mortgage. So yeah, you can't really count on that.
When I see the verbiage right away, “Your spouse's decision on when to begin this benefit can impact the amount of their spousal benefit.” So, then I'm thinking “oh my gosh, she's five years older. What is that going to do to me if she is going to retire earlier?” It kind of makes me go “oh, you know, I need to really look into that.” It makes me right away think “oh gosh, I had no idea. I did not think that”… [It's] just a little bit of a reality check.
Retirees felt that access to my Social Security was good to have, although the platform was not as needed once they started receiving their benefit payments. Some participants, especially older ones, said they would like to see more information resources for financial well-being, such as articles or links to other resources. A few of the retirees who did not have an account prior to the interview said it would have been helpful when they started getting ready to retire.
Conclusions
Our mixed-methods exploration of my Social Security awareness and user experiences yields revealing results. The quantitative analysis suggests that lower internet literacy, and lower educational levels in general, are barriers to my Social Security use. This is important because groups with lower educational attainment may benefit most from the types of information available through the platform. This analysis also suggests that people learn about my Social Security primarily when they become beneficiaries, and as a result, users tend to be older. However, younger groups are typically more internet-literate, and thus possibly better able to take advantage of the platform's features. Moreover, as our qualitative results suggest, younger participants are likely to view my Social Security as a useful financial planning resource.
In our qualitative study, interviewees reported four key reasons for not creating a my Social Security account: (1) lack of awareness of the platform; (2) no perceived relevance/need; (3) security and privacy concerns; and (4) low internet/computer literacy. The latter factor in particular echoes the quantitative finding that low internet literacy inhibits access to and use of the platform. We also observe that, overall, the my Social Security platform is perceived to be clear, navigable, and relevant. Nonretired, nonbeneficiary participants view the information on the platform as instructive and useful. Retirees appreciate but do not have as much use for the platform, although some note that it would have been a useful resource when they were preparing to retire and file for benefits.
Both our quantitative and qualitative evidence show that many individuals start using the platform during or after the benefit-claiming process. Yet our findings imply that my Social Security could be better targeted to, and its retirement preparedness features enhanced for, younger adults (who, our quantitative analysis shows, are less likely to have an account). Our interview sample included 15 nonretirees (some of whom were receiving Social Security benefits other than retirement). The interviews provide evidence suggesting that, in addition to bridging knowledge gaps, my Social Security could help address some behavioral barriers to retirement preparedness or financial planning, such as procrastination, overconfidence, and wariness of the complexity of program rules and information (Kopanidis, Robinson, and Shaw 2016; Blanco, Duru, and Mangione 2020; Choi and others 2006; Benartzi and Thaler 2007; Beshears and others 2013).
The interviews clearly show that the information available at my Social Security can provide a needed jolt of knowledge and correct mistaken expectations (as seen in the reactions of those who had assumed an earlier full retirement age or higher benefit amounts for themselves). It can also serve a critical educational purpose (as seen for those who learned that benefit claiming ages affect survivor and spousal benefit amounts). At the very least, the fact that most Americans claim Social Security retirement benefits at or before their full retirement age (Shoven, Slavov, and Wise 2018) highlights the need for increased awareness of the implications of early claiming.
In a context of low financial literacy and Social Security program knowledge, especially among younger adults, a widely accessible, clear, and personalized information resource could play an important role in both improving financial literacy and supporting financial planning and decision-making. In fact, based on the literature, the platform already meets several of the criteria for effective financial literacy interventions, including clarity and conciseness (Gruber and Orszag 2003; Rabinovich and Perez-Arce 2019), consequence messaging (which is in essence provided by the retirement-benefit estimator) (Samek, Kapteyn, and Gray 2022; Samek and Sydnor 2020), and accessibility and scope (Lusardi and others 2017).
We also find that a key challenge to expanding my Social Security accountholding and use involves the initial capture of users; that is, getting people to create an account. Once they create an account, participants seem broadly happy about how the platform works and what it does. Yet both our quantitative and qualitative findings show that some participants face important barriers to my Social Security account creation, notably low internet literacy. How to address these barriers remains an important question. Nevertheless, internet literacy among older adults will improve over time, as today's more internet-literate younger cohorts age, and this particular barrier should diminish on its own.
This study strongly indicates that further research should address ways to reduce the barriers to using my Social Security, increase public engagement with the platform, and realize its potential as a key resource to support retirement readiness and general financial literacy in nonbeneficiary populations.
Notes
1 Address-based sampling mitigates selection problems facing convenience or “opt-in” panels, whose respondents are recruited from among current internet users only. Chang and Krosnick (2009) and Yeager and others (2011) find evidence that address-based samples are better able to match a population's demographics than nonprobability or random digit-dialing telephone surveys. Prior research has shown that UAS results are close to nationally representative, as benchmarked against well-established surveys (Angrisani, Finley, and Kapteyn 2019).
2 Although respondents to whom UAS provides devices and internet access may increase their internet use and proficiency over time, studies have found that about half of those households continue to be internet nonusers. Even the UAS households that begin using the internet still use it less than those who had prior internet access, and they tend to restrict their use to simple applications (Leenheer and Scherpenzeel 2013). Hence, in providing a tablet and internet access to households with no prior access, the UAS and similarly designed probability panels are likely to be substantially more representative than typical nonprobability panels, which do not provide them.
3 We also conducted a separate analysis in which we calculated an alternative internet literacy variable: the simple mean of the variables in the modules (after changing the sign of the values so that in all cases a higher number represents more knowledge). The correlation between this index and the PCA-based index was 0.994. Given the strong similarities, we used only the PCA-based index.
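The near-equivalence of the two indices can be illustrated with synthetic data (our own sketch, not the actual UAS items): when all items reflect a single underlying trait, the first principal component is nearly proportional to the simple mean of the items.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the literacy items: five indicators of one
# underlying trait, already signed so higher values = more knowledge
trait = rng.normal(size=500)
items = trait[:, None] + 0.5 * rng.normal(size=(500, 5))

# Simple-mean index
mean_index = items.mean(axis=1)

# PCA-based index: score on the first principal component
centered = items - items.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pca_index = centered @ vt[0]

# The sign of a principal component is arbitrary, so compare |correlation|
r = abs(np.corrcoef(mean_index, pca_index)[0, 1])
print(r)
```

With a single dominant factor, the absolute correlation between the two indices is close to 1, consistent with the 0.994 correlation reported above.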
4 Specifically, UAS 16 (https://uasdata.usc.edu/survey/UAS+16), UAS 94 (https://uasdata.usc.edu/survey/UAS+94), and UAS 231 (https://uasdata.usc.edu/survey/UAS+231).
5 Specifically, UAS 26 (https://uasdata.usc.edu/survey/UAS+26), UAS 113 (https://uasdata.usc.edu/survey/UAS+113), and UAS 238 (https://uasdata.usc.edu/survey/UAS+238).
6 The UAS Comprehensive File is produced by the University of Southern California Dornsife Center for Economic and Social Research, with funding from SSA and the National Institute on Aging. The Comprehensive File is continually updated. The version we used was downloaded on July 1, 2022. For more information, see https://uasdata.usc.edu/page/UAS+Comprehensive+File.
7 The most reported activities were “getting an estimate of future benefits,” “tracking and verifying earnings,” and “changing personal information.”
References
Alattar, Laith, Matt Messel, and David Rogofsky. 2018. “An Introduction to the Understanding America Study Internet Panel.” Social Security Bulletin 78(2): 13–28.
Angrisani, Marco, Brian Finley, and Arie Kapteyn. 2019. “Can Internet Match High-Quality Traditional Surveys? Comparing the Health and Retirement Study and Its Online Version.” In The Econometrics of Complex Survey Data: Theory and Applications, edited by Kim P. Huynh, David T. Jacho-Chávez, and Gautam Tripathi (3–33). Bingley (UK): Emerald Publishing Limited.
Armour, Philip. 2018. “The Role of Information in Disability Insurance Application: An Analysis of the Social Security Statement Phase-In.” American Economic Journal: Economic Policy 10(3): 1–41.
———. 2020. “How Did the Reintroduction of the Social Security Statement Change Workers' Expectations and Plans?” Social Security Bulletin 80(4): 23–38.
Benartzi, Shlomo, and Richard Thaler. 2007. “Heuristics and Biases in Retirement Savings Behavior.” Journal of Economic Perspectives 21(3): 81–104.
Beshears, John, James J. Choi, David Laibson, and Brigitte C. Madrian. 2013. “Simplification and Saving.” Journal of Economic Behavior & Organization 95: 130–145.
Bhargava, Saurabh, and Dayanand Manoli. 2015. “Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment.” American Economic Review 105(11): 3489–3529.
Blanco, Luisa R., O. Kenrik Duru, and Carol M. Mangione. 2020. “A Community-Based Randomized Controlled Trial of an Educational Intervention to Promote Retirement Saving Among Hispanics.” Journal of Family and Economic Issues 41(2): 300–315.
Braun, Virginia, and Victoria Clarke. 2006. “Using Thematic Analysis in Psychology.” Qualitative Research in Psychology 3(2): 77–101. https://doi.org/10.1191/1478088706qp063oa.
Carman, Katherine Grace, and Angela A. Hung. 2018. “Social Security Household Benefits: Measuring Program Knowledge.” MRRC Working Paper No. 2018-384. Ann Arbor, MI: University of Michigan Retirement Research Center.
Chan, Sewin, and Ann Huff Stevens. 2008. “What You Don't Know Can't Help You: Pension Knowledge and Retirement Decision-Making.” The Review of Economics and Statistics 90(2): 253–266.
Chang, Linchiat, and Jon A. Krosnick. 2009. “National Surveys Via RDD Telephone Interviewing Versus the Internet: Comparing Sample Representativeness and Response Quality.” Public Opinion Quarterly 73(4): 641–678.
Choi, James J., David Laibson, Brigitte C. Madrian, and Andrew Metrick. 2006. “Saving for Retirement on the Path of Least Resistance.” In Behavioral Public Finance: Toward a New Agenda, edited by Edward J. McCaffrey and Joel Slemrod (304–351). New York, NY: Russell Sage Foundation.
Francis, Jill J., Marie Johnston, Clare Robertson, Liz Glidewell, Vikki Entwistle, Martin P. Eccles, and Jeremy M. Grimshaw. 2010. “What Is an Adequate Sample Size? Operationalising Data Saturation for Theory-Based Interview Studies.” Psychology and Health 25(10): 1229–1245.
Gruber, Jonathan, and Peter Orszag. 2003. “Does the Social Security Earnings Test Affect Labor Supply and Benefits Receipt?” National Tax Journal 56(4): 755–773.
Guest, Greg, Arwen Bunce, and Laura Johnson. 2006. “How Many Interviews Are Enough? An Experiment with Data Saturation and Variability.” Field Methods 18(1): 59–82.
Hennink, Monique M., Bonnie N. Kaiser, and Vincent C. Marconi. 2017. “Code Saturation Versus Meaning Saturation: How Many Interviews Are Enough?” Qualitative Health Research 27(4): 591–608.
Kopanidis, Foula, Linda Robinson, and Michael Shaw. 2016. “‘I'm Not Old Enough!’ Why Older Single Women Are Not Engaging in Retirement Planning Services.” In Rediscovering the Essentiality of Marketing: Proceedings of the 2015 Academy of Marketing Science (AMS) World Marketing Congress, edited by Luca Petruzzellis and Russell S. Winer (871–876). Cham (Switzerland): Springer International Publishing. https://doi.org/10.1007/978-3-319-29877-1_169.
Leenheer, Jorna, and Annette C. Scherpenzeel. 2013. “Does It Pay Off to Include Non-Internet Households in an Internet Panel?” International Journal of Internet Science 8(1): 17–29.
Lusardi, Annamaria, Anya Samek, Arie Kapteyn, Lewis Glinert, Angela Hung, and Aileen Heinberg. 2017. “Visual Tools and Narratives: New Ways to Improve Financial Literacy.” Journal of Pension Economics & Finance 16(3): 297–323.
Mastrobuoni, Giovanni. 2011. “The Role of Information for Retirement Behavior: Evidence Based on the Stepwise Introduction of the Social Security Statement.” Journal of Public Economics 95(7–8): 913–925.
Morgan, M. Granger, Baruch Fischhoff, Ann Bostrom, and Cynthia J. Atman. 2002. Risk Communication: A Mental Models Approach. New York, NY: Cambridge University Press.
Namey, Emily, Greg Guest, Kevin McKenna, and Mario Chen. 2016. “Evaluating Bang for the Buck: A Cost-Effectiveness Comparison Between Individual Interviews and Focus Groups Based on Thematic Saturation Levels.” American Journal of Evaluation 37(3): 425–440.
Rabinovich, Lila, and Francisco Perez-Arce. 2019. “Improving Understanding of the Retirement Earnings Test.” Journal of Pension Economics & Finance 20(4): 496–503. https://doi.org/10.1017/S1474747219000222.
Rabinovich, Lila, Francisco Perez-Arce, and Joanne Yoong. 2022. Understanding America Study. UAS238: Retirement Planning—Wave 3. Survey codebook. Los Angeles, CA: University of Southern California Center for Economic and Social Research.
Samek, Anya, Arie Kapteyn, and Andre Gray. 2022. “Using Vignettes to Improve Understanding of Social Security and Annuities.” Journal of Pension Economics & Finance 21(3): 326–343. https://doi.org/10.1017/S1474747221000111.
Samek, Anya, and Justin R. Sydnor. 2020. “Impact of Consequence Information on Insurance Choice.” NBER Working Paper No. 28003. Cambridge, MA: National Bureau of Economic Research.
Sass, Steven A. 2015. “Does the Social Security ‘Statement’ Add Value?” Issue in Brief No. 15-11. Chestnut Hill, MA: Center for Retirement Research at Boston College.
Shoven, John B., Sita Nataraj Slavov, and David A. Wise. 2018. “Understanding Social Security Claiming Decisions Using Survey Evidence.” Journal of Financial Planning 31(11): 35–47.
Smith, Barbara A. 2020. “Can Informational Interventions Be Effective Policy Tools? An Initial Assessment of the Social Security Statement.” Social Security Bulletin 80(4): 1–22.
Smith, Barbara A., and Kenneth A. Couch. 2014. “How Effective Is the Social Security Statement? Informing Younger Workers About Social Security.” Social Security Bulletin 74(4): 1–19.
Thomas, David R. 2006. “A General Inductive Approach for Analyzing Qualitative Evaluation Data.” American Journal of Evaluation 27(2): 237–246.
Van Deursen, Alexander J. A. M., Ellen J. Helsper, and Rebecca Eynon. 2016. “Development and Validation of the Internet Skills Scale (ISS).” Information, Communication & Society 19(6): 804–823.
Yeager, David S., Jon A. Krosnick, LinChiat Chang, Harold S. Javitz, Matthew S. Levendusky, Alberto Simpser, and Rui Wang. 2011. “Comparing the Accuracy of RDD Telephone Surveys and Internet Surveys Conducted with Probability and Non-Probability Samples.” Public Opinion Quarterly 75(4): 709–747.
Yoong, Joanne, Lila Rabinovich, and Saw Htay Wah. 2015. “What Do People Know About Social Security?” CESR-Schaffer Working Paper No. 2015-022. Los Angeles, CA: University of Southern California Center for Economic and Social Research.