The US Office of Evaluation Sciences Releases 2016-2017 Results

The US Office of Evaluation Sciences (OES), a team of social and behavioral scientists tasked with designing and testing evidence-based interventions within the federal government, has released the results of its most recent intervention studies.

Housed in the US General Services Administration (GSA), OES has partnered with more than a dozen agencies on over 45 projects since its inception in 2015. OES’s overall aim is to employ rigorous evidence-based research methods to identify potential programmatic changes and test their effectiveness in areas of interest to the federal government, such as improving government operations and public programs, an endeavor that APS has helped to propel since 2013. Recognizing the importance of transparency and reproducibility, OES is also committed to making its research tools and processes available for public use.

For one project conducted in 2017, OES partnered with the Office of Personnel Management (OPM) to increase the number of individuals from historically underrepresented groups in the federal applicant pool and hiring process. USAJOBS, the federal government’s employment database managed by OPM, gives users the option to submit demographic information voluntarily, but submission rates remain low, making it difficult to determine the demographic makeup of the federal applicant pool. The OES team posited that requiring job applicants to provide this information by default, while allowing them to opt out, would yield higher demographic-questionnaire response rates than the existing system, which requires applicants to opt in. OES also wondered whether simpler language describing how the demographic information is used would have a positive effect on submission rates.

OES designed a study in which over 2.5 million USAJOBS users were randomly assigned to one of four groups: a control group using the current opt-in interface, an opt-out group, an opt-in group with simplified language, or an opt-out group with simplified language. According to the OES report, the opt-out conditions were most effective in increasing submissions of demographic information, and pairing the opt-out default with simplified language was the most effective combination of all.
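
As an illustration of the mechanics of such a design, the sketch below randomly assigns users to the four conditions of a 2x2 factorial experiment (default: opt-in vs. opt-out; wording: standard vs. simplified). This is a minimal Python sketch with invented names and counts, not OES’s actual code or data.

```python
import random
from collections import Counter

# Hypothetical 2x2 factorial design: default (opt-in vs. opt-out)
# crossed with wording (standard vs. simplified). Condition names
# and user counts are invented for illustration.
CONDITIONS = [
    ("opt-in", "standard"),    # control: current USAJOBS interface
    ("opt-out", "standard"),
    ("opt-in", "simplified"),
    ("opt-out", "simplified"),
]

def assign(n_users: int, seed: int = 42) -> Counter:
    """Randomly assign each user to one of the four conditions."""
    rng = random.Random(seed)
    return Counter(rng.choice(CONDITIONS) for _ in range(n_users))

if __name__ == "__main__":
    counts = assign(2_500_000)
    for condition, n in sorted(counts.items()):
        print(condition, n)
```

Simple random assignment like this is what licenses the causal comparison: with millions of users, the four groups should be statistically indistinguishable before the intervention.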

Not all of OES’s studies reported statistically significant effects. In another study, conducted in collaboration with the Department of Veterans Affairs (VA) Health Eligibility Center, OES investigated whether informational emails about VA health care benefits would increase enrollment in health care programs. According to the VA, many service members and their families are unaware of their health care benefits, the eligibility requirements, and the enrollment process. In the study designed by OES, one group of randomly selected transitioning service members was sent informational emails, whereas a control group was not. After comparing enrollment rates, OES found that the emails did not produce a statistically significant change. OES plans to examine other factors that might improve enrollment rates.
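
Statistically, comparing enrollment rates between the emailed group and the control group is a comparison of two proportions. The sketch below shows one standard way to run that comparison, a two-sided two-proportion z-test; the enrollment counts are invented for illustration and are not the VA’s data.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value: 2 * (1 - Phi(|z|)) = erfc(|z| / sqrt(2)).
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Invented enrollment counts for illustration only.
z, p = two_proportion_z(success_a=1_050, n_a=20_000,   # emailed group
                        success_b=1_000, n_b=20_000)   # control group
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the p-value comes out well above .05, which is the shape of a null result like the one OES reported.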

In a third study, OES worked with GSA’s Federal Supply Schedules program, which charges an industrial funding fee to federal vendors to cover operational costs. Vendors must report their sales through an online form, after which the correct fee total is calculated. OES experimented with placing the form’s signature box at the beginning rather than at the end, to determine whether the change would promote more accurate self-reports of sales, thereby directly affecting the calculated fee total. This reasoning was based on prior research suggesting that a signature box at the beginning of a form, rather than at the end, primes people to be as accurate as possible as they fill it out. In OES’s study, federal vendors were randomly assigned either to a control group using the existing reporting system or to an experimental group that used the modified form. In the initial results, first reported in 2015, the median self-reported sales amount was significantly higher in the quarter immediately following the modification, amounting to over $1.5 million in fees paid to the federal government in a single quarter. In subsequent quarters, however, OES did not find statistically significant effects of the signature box’s location. Thus, the study suggests that placing the signature box at the top of the form may improve the accuracy of self-reports initially, but more research is needed to determine whether the effect persists over time.
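
The quarter-by-quarter analysis OES describes comes down to testing whether two groups differ on a summary statistic such as the median. One common approach is a permutation test, sketched below with invented sales figures; this illustrates the general technique, not OES’s actual analysis.

```python
import random
import statistics

def permutation_test_medians(control, treatment,
                             n_permutations=10_000, seed=0):
    """Approximate two-sided p-value for the observed difference in
    medians, under the null that group labels are exchangeable."""
    rng = random.Random(seed)
    observed = statistics.median(treatment) - statistics.median(control)
    pooled = list(control) + list(treatment)
    n_control = len(control)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # reassign group labels at random
        diff = (statistics.median(pooled[n_control:])
                - statistics.median(pooled[:n_control]))
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_permutations

# Invented quarterly sales reports (in dollars) for illustration only.
control = [12_000, 48_500, 7_200, 95_000, 31_000, 15_500]
treatment = [14_800, 52_000, 9_900, 101_500, 36_700, 18_200]
print(permutation_test_medians(control, treatment))
```

A permutation test is a natural fit here because sales data are heavily skewed, which makes medians more informative than means and makes normality assumptions suspect.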

Overall, behavioral interventions, or ‘nudges,’ like those implemented by OES have been found to be effective. In a recent Psychological Science article, researchers identified several policy areas of interest (e.g., health care), reviewed the existing literature for examples of behavioral interventions targeting those areas, and calculated their cost-effectiveness. The authors report that nudges yield high returns at relatively low cost compared with traditional policy approaches, such as prohibitions or mandates.
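
At its core, that cost-effectiveness comparison reduces to a simple ratio: impact achieved per dollar spent. The snippet below computes such a ratio for two hypothetical interventions; the figures are invented and are not taken from the study.

```python
# Cost-effectiveness as impact per dollar spent. All figures are
# hypothetical and chosen only to illustrate the calculation.
interventions = {
    # name: (additional enrollments achieved, implementation cost in $)
    "email nudge": (1_500, 50_000),
    "financial incentive": (4_000, 1_000_000),
}

for name, (impact, cost) in interventions.items():
    ratio = impact / cost
    print(f"{name}: {ratio:.4f} additional enrollments per dollar")
```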

APS has played a key role in catalyzing the conversation about the importance of behavioral science in public policymaking. During the 25th Annual Convention in 2013, APS organized a White House workshop focusing on the intersection of psychological science and behavioral economics in the public policy sphere, which brought together behavioral scientists and government leaders, among others. On September 15, 2015, President Obama signed an executive order requiring federal agencies to incorporate behavioral insights into their evaluation efforts.

Following the executive order, President Obama established the White House’s Social and Behavioral Sciences Team (SBST) and the Office of Evaluation Sciences, spurring OES’s many collaborations with federal agencies, such as the Departments of Education and Veterans Affairs, on issues of student loan repayment and workplace savings enrollment, respectively. The first round of SBST/OES results was released in 2016 and received praise from many federal agencies and academics, as APS reported in a November 2016 Observer article. Under the Trump administration, OES continues the work previously conducted by both SBST and OES, and the team is anticipated to present at the 30th APS Annual Convention in San Francisco later this year.

To read more about SBST/OES and their work, see the October 2015 Observer cover story, “Reaching Citizens Through Science.”

The SBST was inspired in part by the UK’s Behavioral Insights Team (BIT), which former Prime Minister David Cameron commissioned in 2010 to test public-policy interventions through randomized controlled trials.

Become an OES Fellow
Interested in joining the OES team? OES is seeking applied social and behavioral science researchers, including psychological scientists, to work as Fellows.

Crystal C. Hall, a psychological scientist and OES Fellow, explains that researchers who do experimental work and are committed to communicating their research process and results effectively to the general public will be particularly interested in applying.

Furthermore, the recent emphasis on behavioral science in policymaking points to future opportunities for psychological scientists looking to be involved in more applied work.

“There’s lots of these kinds of little units setting up in different cities and states around the country. There’s still a lot of work that [psychological scientists] have to offer, so I look forward to seeing more and more of this research coming out,” says Hall.

For more on OES from a psychological science perspective, read the full interview with Crystal C. Hall.

