HGSE Community

Community

What we're learning, together

Questions, discoveries, and honest reflections from across HGSE's AI community. Everyone is figuring this out — this is where we figure it out together.

This Week's Question

Week of March 24, 2026

What's one thing you tried with AI this semester that surprised you — either because it worked better than expected, or failed in an interesting way?

We're collecting honest reflections, not success stories. The interesting stuff lives in the unexpected.

AI Summary · Based on 3 responses

A clear pattern is emerging: AI tools work differently depending on the learner's existing confidence level. Confident students use AI to deepen their thinking, while struggling students can feel more alienated. Several responses highlight that the most valuable AI applications are ones that create productive friction — forcing students to engage with ideas they'd otherwise avoid — rather than reducing effort.

Responses (3)

Priya Anand

Ed.M. '26 · TIE · 3 hours ago

I had students use Claude to generate counterarguments to their own thesis statements. Expected them to just copy-paste — instead, several said it forced them to actually understand the opposing view for the first time. The surprise was that the AI made them think harder, not less.

David Dockterman

Faculty · 5 hours ago

Tried using an AI to summarize student discussion posts before our seminar. The summaries were technically accurate but stripped out all the interesting tensions and contradictions. Reminded me that what matters in a discussion isn't the conclusions — it's the friction between ideas.

Ling Zhang

Researcher · Next Level Lab · Yesterday

Failed experiment: I gave an AI tutor to a group of 4th graders for math practice. The kids who were already confident loved it. The kids who were struggling got more anxious — they said the AI 'didn't understand what they didn't understand.' Huge insight for how we think about scaffolding.

DISCUSSION

Marcus Chen

Ed.D. Candidate · 4 hours ago

How are you handling AI detection in student work?

Our department just had a heated meeting about this. Some faculty want to use GPTZero on everything, others think detection tools are unreliable and create an adversarial dynamic with students. I lean toward the latter but I don't have a good alternative framework yet. What are you all doing?

Assessment · Policy · Ethics

SHARED RESOURCE

Tomás Reyes

Ed.M. '26 · LDIT · 6 hours ago

AI Tutoring Systems Show Promise but Struggle with Affect

Affective Responses to AI Tutoring in Elementary Mathematics

arxiv.org

This confirms what Ling's team saw in the 4th grade pilot — AI tutors work well for confident learners but can increase anxiety for struggling students. The affective dimension is the missing piece.

Research · K-12 · Tools

EVENT RECAP

Amira Hassan

Ed.M. '27 · TIE · 2 days ago

Takeaways from the March AI Hack

We had 24 participants build projects in 2 hours. Highlights: a team built an AI-powered discussion facilitator for seminars, another prototyped a tool that generates culturally responsive math word problems, and a faculty member (who'd never coded before) built a working chatbot for office hours. The energy was incredible — people who usually work in silos were building together.

Tools · Community · Pedagogy

DISCUSSION

Ling Zhang

Researcher · Next Level Lab · 3 days ago

Looking for collaborators: AI-assisted peer review in teacher education

I'm exploring how AI could support (not replace) the peer review process in teacher preparation programs. Specifically: can AI help pre-service teachers give better feedback to each other on lesson plans? Looking for 2-3 people interested in co-designing a pilot. Background in teacher ed, learning design, or AI ethics especially welcome.

Research · Pedagogy · Higher Ed

SHARED RESOURCE

James Whitfield

Alumni '24 · Civic Tech · 4 days ago

This NYT piece gets at why the 'AI in schools' debate is stuck

The Schools That Are Teaching Students to Think With AI, Not Just Use It

nytimes.com

The framing is always 'ban it or embrace it' — this article finally names the middle ground: teach students to be critical users. Not revolutionary, but the examples from real classrooms are worth reading.

K-12 · Policy · Pedagogy

EVENT RECAP

David Dockterman

Faculty · 5 days ago

LEAP Quick Lab: Prompt Engineering for Assessment

Key insight from this session: the best assessment prompts aren't about getting AI to grade student work — they're about using AI to generate better questions. We practiced writing prompts that create Bloom's-aligned questions at specific difficulty levels. Three faculty left with working prompt templates they're using this week.

Assessment · Tools · Pedagogy

Upcoming Events

Hack

April AI Hack

Apr 3 · 5:00 PM

Open-ended, 2 hours. Build something with AI for education.

18 going

Workshop

LEAP Quick Lab: AI Feedback Tools

Apr 8 · 12:00 PM

Hands-on session exploring AI-powered feedback for student writing.

12 going

Office Hours

Office Hours with Andrés

Apr 10 · 3:00 PM

Drop in for help with any AI tool or teaching question.

5 going

Active Members

Priya Anand

Ed.M. '26 · TIE

Marcus Chen

Ed.D. Candidate

Sarah Okonkwo

Ed.M. '26 · TIE

David Dockterman

Faculty

Tomás Reyes

Ed.M. '26 · LDIT

Ling Zhang

Researcher · Next Level Lab

Past Prompts

Week of March 17

What's one AI tool you wish existed for your classroom or research?

18 responses

Week of March 10

How do you talk about AI with people who are skeptical or afraid of it?

23 responses

Week of March 3

Share a resource that changed how you think about AI in education.

14 responses
