Editor's Note: This article originally appeared in the December 2025 edition of “The Sway,” the newsletter of The Graduate School at Howard University.
Artificial intelligence is no longer a futuristic concept in academia; it’s here, and it’s transforming the way doctoral students learn, research, and write. But as AI becomes embedded in Ph.D. programs, educators are grappling with a critical question: How do we harness its benefits without compromising the intellectual rigor and authenticity that define doctoral work?
For Dr. Desta Haileselassie Hagos, a lecturer in computer science and the AI/ML Technical Lead Manager at Howard University, the answer lies in balance.
“AI should be treated as an intellectual partner, not a substitute for thinking,” he said. “Its role is to improve how our graduate students search, analyze, simulate, and communicate. But the core intellectual work — framing problems, exercising judgment, making original contributions — must remain the responsibility of the researcher.”
AI’s potential in doctoral education is undeniable. Unlike traditional digital tools or simple search engines, modern AI systems go beyond discrete task automation.
“Modern AI systems are different because they are generative and conversational. They can suggest ideas, question assumptions, and adjust to a graduate student’s level in real time. This makes them powerful but also riskier, because they shape how students reason, not just how they execute tasks,” said Hagos.
Gone are the days when graduate students spent hours in the science library doing literature searches. AI has created efficiencies around some of these processes, from searching and summarizing long papers to mapping research themes across many articles and pointing to related work that a graduate student might otherwise overlook.
Hagos takes a deeper dive into AI and graduate education in a Q&A featured in “The Sway,” the newsletter of The Graduate School at Howard University.
The Sway: How can AI tools assist in literature review and data analysis for Ph.D. students?
Hagos: For literature review, AI can help students with searching, summarizing long papers, mapping research themes across many articles, and pointing to related work that a student might overlook. For data analysis, it can assist with cleaning data, proposing appropriate models, generating sample code, and explaining statistical results. But students still need to understand and verify every step themselves instead of completely relying on AI to do the job for them.
The Sway: What impact do you think AI will have on the originality and rigor of doctoral research?
Hagos: I think AI can have two very different effects on originality and rigor. When students use it responsibly, it can actually raise the standard, because they can spend less time on routine work and more time refining their ideas and checking the strength of their methods. But if it is used carelessly, it can lead to work that looks polished on the surface but is not grounded in solid reasoning or evidence. The impact ultimately depends on how clearly programs define expectations and how well they teach students to use AI in a disciplined way.
Read the full Q&A in The Sway.