
Technology Trends: Keep a Wary Eye on Artificial Intelligence
By Marya Gwadz, PhD, MA, and Amanda Ritchie
Social Work Today
Vol. 22 No. 1 P. 32

Whether they know it or not, anyone conversing with an Echo device, searching Google for the capital of Moldova, or playing hearts on a computer is working with artificial intelligence (AI).

In social work, researchers, too, are engaging with AI and data science to illuminate better ways to address many familiar social concerns, including poverty, racial inequities, and mental health.

Can data analytics and AI help social workers pinpoint households where children are at a higher risk of mistreatment? Can understanding the experience of vast numbers of individuals who are out on parole help identify effective ways to support their success in the community? What type of new intake process at mental health clinics works best to retain adolescent clients?

Given the increasing number of studies employing the relatively new tools of data-driven technology, it is apparent that many social work researchers believe in their potential benefits. However, there’s still a fair amount of concern about these tools, and some social workers are focused on addressing the most disturbing aspects, such as the inherent biases that can be found in Big Data.

For now, it’s important to note that the harnessing of technology for social good has been endorsed by the Grand Challenges for Social Work, a framework established for the social work profession by the American Academy of Social Work and Social Welfare.

One key goal highlighted by this professionwide initiative is the integration of new technology into social work and the building of capacity to use powerful digital resources to identify and implement solutions to pressing social problems. Examples of digital science’s applicability to social work research have started to crop up. For example, social work researchers are already using AI and natural language processing to detect bias in social media and to develop culturally sensitive algorithms for violence detection and prevention. Other researchers are merging social work and AI to enhance HIV prevention programs for homeless youth, while others are working on predictive risk modeling to help child welfare agencies identify risks for child abuse and maltreatment.

Growing engagement with such tools as Big Data, algorithms, and AI also is reflected in the use of large administrative data sets from public assistance, employment, and social service records. The data gathered from these sources help the profession better understand how specific policy elements affect program participants.

The current use of large child welfare administrative records from multiple states to identify service patterns and outcomes and to make policy recommendations is another promising use of the new technology. So, too, is the corralling of open data on housing, neighborhoods, and geographic information systems to empower communities to fight disinvestment and predatory lending and to identify foreclosure risks.

Maintaining a Level Playing Field
While AI has the potential to use computational power and engineering to develop solutions to various problems, it remains incumbent on the social work field to ensure that these technological applications are fair and unbiased. Social workers must advocate for and participate in efforts to regulate companies that create and apply AI and algorithmic tools. Significant concerns exist about the potential for inherent data biases to perpetuate bias in child welfare, banking and lending, law enforcement, and health and wellness. The sources of bias in AI are data sets that underrepresent or misrepresent phenomena related to race and gender, as well as algorithms and research methods that amplify biased data.
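One common way such bias is made visible is a demographic parity check: comparing how often an algorithm's positive decision (eg, flagging a family for intervention) falls on different groups. The sketch below is purely illustrative — the decision data and group labels are hypothetical, and real audits use richer fairness metrics:

```python
# Minimal sketch of a demographic parity check on an algorithm's decisions.
# All data here are hypothetical, for illustration only.

def selection_rate(decisions):
    """Fraction of cases receiving the positive decision (e.g., flagged)."""
    return sum(decisions) / len(decisions)

# Hypothetical screening decisions (1 = flagged for intervention).
group_a = [1, 0, 1, 1, 0, 1, 0, 1]
group_b = [0, 0, 1, 0, 0, 1, 0, 0]

# A gap near zero suggests the two groups are flagged at similar rates;
# a large gap is one warning sign that the system treats groups differently.
gap = selection_rate(group_a) - selection_rate(group_b)
print(f"Demographic parity gap: {gap:.3f}")
```

A disparity like this does not by itself prove unfairness, but it flags where the underlying data and model deserve closer scrutiny.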

Furthermore, AI and other data science fields lack adequate representation of women; Black, Indigenous, People of Color (BIPOC); and other marginalized groups, undermining their ability to address issues of fairness, equity, and justice.

Experts are also calling for greater interdisciplinary work in the development and use of AI, especially by social scientists who study the social contexts in which AI operates and engage with the marginalized communities most at risk of harm from AI. BIPOC researchers in AI, such as Timnit Gebru, who has been critical of racism and sexism in facial analysis models, argue for greater diversity in the field and for transparency in AI research and data-sharing standards. Concerns also exist about a potential overreliance on data science in research.

AI in Action
At the Silver School of Social Work at New York University, researchers are bringing the tools of technology to bear in a variety of areas, including the following:

• Victoria Stanhope, PhD, MSW, is exploring person-centered care, which ensures that behavioral health care is individualized and service users are active, empowered partners in their treatment. During the process, she’s utilizing an AI approach to examine collaborative documentation, a strategy in which behavioral health clinicians complete visit notes jointly with consumers during the session. Stanhope’s study is using natural language processing, a text-mining technique that translates narrative text into structured data with an algorithm to analyze clinical visit notes. It will contribute to the base of evidence on collaborative documentation and develop an algorithm to analyze person-centered care to inform quality improvement in behavioral health care.

• Doris F. Chang, PhD, is examining Asian American responses to racism in the COVID-19 era, and exploring macro-contextual and individual predictors of discrimination, intergroup attitudes, and collective action to address racial inequality. The study examines how regional variations in racial climate—indicated by sentiment analysis of geocoded Twitter data of anti-Asian and anti-Black bias as well as solidarity and allyship across racial groups—are associated with racial discrimination and mental health, intergroup attitudes (structural awareness, sense of belonging, political commonality/coalitional attitudes), and collective action and coalitional support (e.g., Asian-Black allyship behaviors). The study is designed to shed light on the multilevel factors that shape Asian Americans’ individual and intergroup responses to racism, and their subsequent civic and political engagement.

• Michael Lindsey, PhD, MSW, MPH, has established an AI hub to help researchers investigate how AI-driven systems can be used to equitably address poverty and challenges related to race and public health, and to provide thought leadership on the implications. The work has been made possible by a $5 million gift from philanthropists Martin Silver and Constance McCatherin Silver.
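The first two projects above rest on the same basic move: translating narrative text into structured data a program can count and compare. A toy version of that idea, using a tiny hand-built sentiment lexicon (the word lists and the sample note are invented for illustration — the actual studies use far more sophisticated natural language processing models):

```python
# Minimal sketch of turning narrative text into structured data with a
# small sentiment lexicon. The lexicon and the sample note below are
# hypothetical; real NLP pipelines use trained models, not word lists.

POSITIVE = {"support", "progress", "engaged", "solidarity", "improved"}
NEGATIVE = {"bias", "discrimination", "harassment", "crisis", "isolated"}

def score_text(text):
    """Return a simple structured record: token count plus sentiment tallies."""
    tokens = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return {"tokens": len(tokens), "positive": pos, "negative": neg,
            "sentiment": pos - neg}

# A hypothetical clinical note, now reduced to numbers an analysis can use.
note = ("Client reported discrimination at work but felt engaged "
        "and supported by solidarity in the group.")
print(score_text(note))
```

Once narrative text becomes rows of numbers like these, researchers can aggregate across thousands of visit notes or tweets and look for patterns no human reader could tabulate by hand.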

In addition to these efforts, social work researchers, such as Desmond Upton Patton of Columbia University, sit on advisory boards for social media companies, including Twitter, that use AI and data science. Other social workers are using virtual reality and natural language processing to develop simulations for social work education.

Also, social work schools at Columbia, Washington University, and USC are involved in innovative work with data science and AI topics, further illustrating the technologies’ emergence as important new areas for education and scholarship.

In the end, Big Data isn’t a panacea but rather a tool. And social scientists, especially social workers and anthropologists who use qualitative methods, would argue that Big Data systems still need to be integrated with unquantifiable “thick data” to provide the greatest possible context, depth, and richness of information.

— Marya Gwadz, PhD, MA, is a professor and the associate dean of research at New York University Silver School of Social Work.

— Amanda Ritchie is director of Center Operations at New York University’s Constance and Martin Silver Center on Data Science and Social Equity.


Resources
Asakura, K., Occhiuto, K., Todd, S., Leithead, C., & Clapperton, R. (2020). A call to action on artificial intelligence and social work education: Lessons learned from a simulation project using natural language processing. Journal of Teaching in Social Work, 40(5), 501-518.

Beimers, D., & Coulton, C. J. (2011). Do employment and type of exit influence child maltreatment among families leaving temporary assistance for needy families? Children and Youth Services Review, 33(7), 1112-1119.

Cancian, M., Han, E., & Noyes, J. L. (2014). From multiple program participation to disconnection: Changing trajectories of TANF and SNAP beneficiaries in Wisconsin. Children and Youth Services Review, 42(C), 91-102.

Drake, B., Jonson-Reid, M., Ocampo, M. G., Morrison, M., & Dvalishvili, D. (2020). A practical framework for considering the use of predictive risk modeling in child welfare. The Annals of the American Academy of Political and Social Science, 692(1), 162-181.

Kingsley, G. T., Coulton, C. J., & Pettit, K. L. (2014). Strengthening communities with neighborhood data. Rowman & Littlefield Publishers.

Larson, A. M., Singh, S., & Lewis, C. (2011). Sanctions and education outcomes for children in TANF families. Children and Youth Services Review, 32(3), 180-199.

Patton, D. U., Frey, W. R., McGregor, K. A., Lee, F. T., McKeown, K., & Moss, E. (2020, February). Contextual analysis of social media: The promise and challenge of eliciting context in social media posts with natural language processing. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 337-342.

Piore, A. (2019, January 14). The reality of racism comes to life in VR film. https://news.columbia.edu/news/reality-racism-comes-life-vr-film

Rice, E., Yoshioka-Maxwell, A., Petering, R., Onasch-Vera, L., Craddock, J., Tambe, M., & Wilson, N. (2018). Piloting the use of AI to enhance HIV prevention interventions for youth experiencing homelessness. Journal of the Society for Social Work and Research, 9(4), 551-573.

Wulczyn, F. H., Chen, L., & Hislop, K. B. (2007, December). Foster care dynamics, 2000-2005: A report from the multistate foster care data archive. https://fcda.chapinhall.org/wp-content/uploads/2013/10/Foster-Care-Dynamics-2000-2005.pdf