
2024 Summer Pre-College Creative AI & Design Course

Prompt Battle 1
Our first generative AI prompt battle.

This summer we invited 13 Cambridge, MA youth from diverse backgrounds to explore and learn the fundamentals of generative artificial intelligence, or GenAI. This experience created space for youth to bring their identities and interests into creative computing in ways that allow them to see themselves in GenAI. They worked together and addressed questions such as: What is generative AI and where do we see it in the art world? How does working with AI tools affect creativity? Can AI teach us how to be more creative? What are the benefits and harms of GenAI?

This in-person, pre-college course grew out of Lesley STEAM’s collaboration with Cambridge Youth Programs (CYP). CYP staff recruited the students and youth counselors who assisted our team. Every day for one week, from 9:30am – 5:30pm, CYP youth (students) met at Lesley University’s College of Art and Design, or LA+D. The youth earned 2 Lesley college credits and received stipends for their participation. The skills they learned can be applied to other classes they may take, such as art and computer science. Students can also explore GenAI applications for entrepreneurship opportunities.

face sensing AI in scratch
Learning how to use face sensing AI in Scratch.

LSTEAM/CYP facilitated activities such as face-sensing AI with the Scratch programming language, GenAI design thinking, and events using text, image, and music GenAI tools. Class activities culminated in capstone “project pitches” and presentations. Main objectives of the class included:

  • Learning the fundamentals of generative artificial intelligence, or GenAI.
  • Exploring design thinking for GenAI (empathize, define, ideate, prototype, test).
  • Referencing, applying, and combining artistic styles to generate new images, text, and music.
  • Exploring bias/anti-bias, copyright, and intellectual property issues in GenAI.
Joy Buolamwini talks about racial bias in AI.

We began the course by learning about GenAI, which uses machine learning models to create new content such as images, text, music, and video/animation. A machine learning model is a program that finds patterns or makes decisions based on data it has not seen before. Facial recognition, which identifies and measures facial features in an image, is one of the more obvious applications of machine learning. Students watched and discussed the film Coded Bias, which shows how many facial recognition technologies fail to accurately detect certain faces, such as those of darker-skinned women.

GenAI Art Tools & Daily Challenges

Throughout the course, students had access to and used several GenAI art tools, including Adobe Firefly, Deep Dream Generator, ChatGPT, Udio, and PoseNet. Each day focused on a specific GenAI art tool. These tools are built on neural networks, including large language models or LLMs, that are trained on massive amounts of data. We looked at several videos that explained what AI art is and how it came to be, such as this one: AI Art, Explained. Daily challenges provided students with tasks to accomplish. Each morning we opened with a cypher (circle) to give students a chance to share their expectations, interests, and concerns. We also closed with a cypher to share our accomplishments and challenges for the day.

A cypher doesn’t need a stage or designated area in which to take place, they can, and do, form anywhere; at parties, in clubs, outside on the concrete, in train stations, on a beach, in someone’s living room. Spontaneous in its forming, all that [performers] need is space and they will cypher. —Emmanuel Adelekun, Red Bull
Each day, students were tasked with reading articles such as a blog post written by Nettrice about “BBL Drizzy,” a song that recently made hip-hop history by accidentally sampling an AI-generated song written by @kingwillonius, who was featured in WIRED magazine. Next, students learned how to use Udio to create their own AI-generated songs.

Visual Storytelling 3.0

Understanding language is very important when using generative AI tools. Students explored GenAI for visual storytelling, the art of communicating messages, emotions, narratives, and information in a way that reaches viewers at a deep and lasting level. Students learned how to write prompts in ChatGPT to create text for visual metaphors that helped them translate their ideas into more understandable forms. They also used Adobe Firefly and other AI image generators, which combine language models with image generation models, to help refine and steer their ideas toward the desired results.

visual storytelling 3.0
Nettrice’s Visual Storytelling 3.0 slide.

In order to generate compelling text and images, students learned how to engineer prompts. Prompt engineering is a growing field that involves writing text that a generative AI model can interpret and act on. Students learned how to add “boosters,” or modifier words, to their prompts to create more distinctive content, as in the example below. They also practiced this skill during prompt battles and when exploring different GenAI tools.
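For example (an illustrative prompt, not one taken from the class), a plain prompt such as “a lighthouse on a rocky coast” can be boosted with modifiers for medium, lighting, and framing: “a lighthouse on a rocky coast, watercolor style, golden hour lighting, wide-angle shot.” Small additions like these often change the generated image dramatically.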

Prompt Battles

Many of the students were very interested in honing their prompt engineering skills, so we engaged them in prompt battles: events in which people compete head-to-head, using text-to-image GenAI tools to turn written prompts into images. Our events, which were very popular with the CYP youth, allowed the participants to show off their prompt skills, and the audience (their peers) chose the winner of each head-to-head match. The person who won the most matches by the end of each battle won a prize (but everyone got something).

second prompt battle
The second prompt battle.
prompt battle slide
Example of a head-to-head prompt battle

We held two prompt battles, one on Tuesday afternoon and another on Thursday afternoon. We used these events to measure how much progress the participants were making with their prompt engineering, looking for legibility, breadth, scope, and clarity in their written prompts. Students learned that, above and beyond the basic written prompt, the participants with the best modifiers or “boosters” often won their matches. People can’t simply rely on the GenAI tool to do the work for them.

Coding with ML5.js

Some of the CYP youth were very interested in coding beyond Scratch and face sensing, so we explored ml5.js, an open source library with a goal of making machine learning approachable for artists, creative coders, and students. The “ml” in ml5.js stands for machine learning, a subset of artificial intelligence, and “js” stands for JavaScript, a programming language and core technology of the web, alongside HTML and CSS. A key feature of ml5.js is its ability to run pre-trained machine learning models for web-based interaction. These models can classify images, identify body poses, recognize facial landmarks and hand positions, and more. Students were given the option to use these models as they were, or as a starting point for further learning with ml5.js’s neural network module, which lets them train their own models.
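To give a sense of how approachable these pre-trained models are, here is a minimal sketch of image classification with ml5.js. It assumes the pre-1.0 ml5.js API, a page that loads the library from a script tag, and an image element with the id “photo” (the id and image are illustrative assumptions, not details from the class):

  // Assumes ml5.js is loaded via a <script> tag and the page contains
  // an <img id="photo" src="..."> element (both hypothetical).
  // Create an image classifier backed by the pre-trained MobileNet model;
  // the model is downloaded and run entirely in the browser.
  const classifier = ml5.imageClassifier('MobileNet', () => {
    console.log('MobileNet model loaded');

    // Ask the model to label the image; results are ranked guesses.
    classifier.classify(document.getElementById('photo'), (error, results) => {
      if (error) {
        console.error(error);
        return;
      }
      // Each result has a label and a confidence score between 0 and 1.
      console.log(results[0].label, results[0].confidence);
    });
  });

Because the model runs in the browser, students can experiment without any server-side setup.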

carnival ai movement app
Movement Painting with the Carnival AI app.

In addition to learning how to code with ml5.js, students learned how p5.js, a JavaScript library for creative coding, is used to make interactive visuals in web browsers. They used the PoseNet Sketchbook and the Carnival AI app to explore pose estimation with PoseNet, a model that estimates the 2D positions of human body key points in images and videos. The Carnival AI app celebrates and engages movement and dance in and from Black and Caribbean communities and turns that movement into visual art. The youth could also use p5.js to code with PoseNet directly, as in the sketch below.
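As a rough illustration of that workflow, here is a minimal sketch combining p5.js with the older ml5.js PoseNet model (it assumes the pre-1.0 ml5.js API, where PoseNet is available; newer ml5.js releases replace it with a bodyPose model). It draws a dot on each body key point the model detects in a live webcam feed:

  // Assumes p5.js and ml5.js are both loaded via <script> tags.
  let video;
  let poses = [];

  function setup() {
    createCanvas(640, 480);
    video = createCapture(VIDEO);   // webcam feed
    video.size(width, height);
    video.hide();                   // draw the video onto the canvas instead

    // PoseNet estimates 2D key points (nose, wrists, knees, etc.) from the video.
    const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
    poseNet.on('pose', (results) => { poses = results; });
  }

  function draw() {
    image(video, 0, 0, width, height);

    // Draw a dot on every key point the model is reasonably confident about.
    for (const result of poses) {
      for (const keypoint of result.pose.keypoints) {
        if (keypoint.score > 0.2) {
          fill(255, 0, 0);
          noStroke();
          ellipse(keypoint.position.x, keypoint.position.y, 10, 10);
        }
      }
    }
  }

A sketch like this can be a starting point for movement-based visuals, with the key point positions driving drawing on the canvas instead of simple dots.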

AI Design Thinking & Final Pitch Projects

To help students brainstorm their own GenAI projects, we used AI Ideation Cards created with input from the AIxDesign community. The deck is organized into seven categories, each with questions (prompts), definitions, and example use cases, to help designers and others leverage AI capabilities. After coming up with a topic, each student was instructed to select the category that best matched their idea. Students imagined AI capabilities as superpowers and thought about how GenAI could help solve their challenges in new, unique, or better ways. Students were given Post-It notes, worksheets, and index cards to work out their ideas. They also used GenAI tools to visualize and plan their final projects.

context awareness scenario
Context Awareness AI scenario card and example project.
Student's final pitch
Student presents final project pitch.

Near the end of the week, students worked individually or in groups on their final pitches. Examples included healthcare (a comprehensive pain assessment tool that enhances diagnostic accuracy and personalizes treatment plans), food (personality quizzes and surveys that suggest appropriate recipes), and music (a high-quality recording tool that gives everyone a chance to share their voice with the world). Students presented their project pitches on the last day.

Please Note: This work has been made possible through collaboration with Cambridge Youth Programs and Lesley University’s College of Art & Design’s PreCollege Program and the generous support of the STAR Initiative.


Girls Make Games

Above: a 5th grade coder mentors a 1st grader using Scratch Jr.

A group of nine Kennedy-Longfellow students spent a weekend in November coding as part of the first-ever Boston-based “Girls Make Games” workshops at MIT. These students immersed themselves in game design, art, and programming using software called Stencyl.

Not only did they get to network with other local girls who code, they were mentored by local professional developers. We sat down with these students and asked them to share their thoughts on what makes them tick as programmers at K-Lo, and beyond. To view the video, click here.


KIBO Brings Robotics Alive for JK-2nd Grades


This fall, K-Lo launched robotics in the primary grades in partnership with Tufts University’s Eliot-Pearson Department of Child Study and Human Development. The school piloted the KIBO, a robot designed with support from the National Science Foundation, with one kindergarten, one first grade, and one second grade class. During the seven-week project, students learned about programming, sensors, and the engineering design process, with ties to readers’ theatre in kindergarten and earth science in the first and second grades.

The Lesley team will continue to support the other primary classrooms in learning about the KIBO and how it can be used to deepen understanding and engagement in other curricular activities. To learn more about the KIBO, you can read the company’s discussion of this innovation, and stay tuned to hear more from K-Lo students about their own experiences with robotics!

Check out a couple videos of students presenting their final projects:


4th Graders Gone Green

Meet KLO’s first paperless classroom! Karla Anderson’s 4th grade class has jumped into the paperless fray with digital notebooks for science and math. Ms. Anderson walked the class through setting up individualized math and science notebooks in the Notability app. Notability was chosen over other apps because of the following features:

  • Creation of separate, color-coded notebooks.
  • Ability to easily sync with student and teacher district Google doc accounts.
  • Ability to import PDFs and images and draw on top of them.

To share assignments with students, Ms. Anderson simply places a worksheet in a Google Docs folder shared with all of her students. Students log into the Google Drive app on their iPads and open the worksheet directly in Notability, where they can either type or draw their answers. When finished with the assignment, they “share” the worksheet back into their Google Drive so Ms. Anderson can review, comment on, and grade the work.

The app Book Creator has long been a favorite KLO app for students to create their own eBooks. Ms. Anderson has found an innovative use for it thanks to Tim Harkins (@mrtharkins), a 2nd grade teacher at West Elementary School in Andover, MA, who presented his science eBook idea at the 2013 fall MassCUE Conference. Ms. Anderson has created eBook “texts” for each science curriculum unit. The eBooks are shared out with students via Google Drive. Each student then brings the ePub into Book Creator on their iPad and plugs in their answers, photos, and videos as the unit progresses. Each text also includes pre- and post-assessments so that students and the teacher can see the measure of student growth nested conveniently among the content.

We asked two students to share their experience using the eBook texts as well as to give a brief tutorial on how to use the Notability and Book Creator apps:

Ms. Anderson will then take you deep into her Google Drive process to demonstrate her paperless workflow:


The Confidence to Create

Yodamael St. Rose isn’t your typical 4th grader. In the past year she has directed her own movie, programmed interactive games in Scratch, and created a tutorial for other students on how to connect and use the school’s Raspberry Pi (the “3P K-LO”) computer packs at home. We asked Yoda to share what kind of tools and projects give her the confidence to create and here is what she shared with us:


3-2-1, Action!

When asked how they would like to present their informational texts on habitats, a group of third grade students unanimously chimed, “the green screen!” The project began with each student identifying a habitat they wanted to study. Research started in the classroom and school library, where students read habitat-specific books and narrowed in on an animal of choice. The group recorded notes in their science journals, distilling their reading into informational bits related to each animal’s environment, prey, and physical characteristics. We then discussed the idea of an essential question to build a deeper focus on one aspect of their animal. It was agreed across the board that everyone was interested in finding out how each animal survives in its respective habitat, but students felt they needed to do more research to explore this question, so we moved on to the iPads, where they could access Newsela.com and Kids InfoBits. Newsela.com is a website that produces Lexile-leveled articles on current events, neatly organized into categories such as science, kids, money, and law. Students also accessed Kids InfoBits, a student-friendly database that houses a wealth of text, images, and videos. (See the student tutorial below.)

Together with the teachers, students chose articles closely related to their animal and environment and set about close readings of the text to home in on the main themes. After another group check-in, it was determined that all of the students were intrigued by the idea of how climate change was affecting their chosen animal, and the essential questions were altered to address this specific interest. It was also agreed that they would like to share their research with their peers by creating public service announcement (PSA) videos using the Makerspace green screen, with one student choosing to make a stop-motion animation using the iPad app myCreate. Once students felt their research was complete, we watched examples of PSAs online to get a sense of how they are crafted to share a strong message with the audience. Inspired by the possibilities, they eagerly jumped into storyboarding, which included writing a script with specific dialogue for each scene. Students conferenced with a teacher one-on-one to choose corresponding images, music, and/or sound effects for each scene. We made sure to use only images and music that were in the public domain.

Students took turns filming their scenes in front of the green screen (simply a large green cloth backdrop!), reading their scripts off of the SmartBoard projection. The one student doing the stop motion created a backdrop and characters out of felt and, using myCreate on her iPad, took a still image of each movement of her characters, which were automatically combined into a single timeline video. Lastly, the green screen and myCreate footage were brought into iMovie, where students worked with teachers to add images, music, or narration to their videos. The final products were shared out with third grade students and families to much applause.

Student videos:

Student tutorial on safe research sites:


Junior K uses Augmented Reality to Search for Gingerbread Men! (JK)


Junior Kindergarten (JK) students at Kennedy-Longfellow school solved the mystery of the missing gingerbread cookies with some help from an augmented reality iPad app called Layar. JK teacher Siobhan Patterson developed the interactive scavenger hunt, based on the Gingerbread Man book, which included gingerbread cookie paper cutouts placed with different staff throughout the school. Audio clues were prerecorded on the computer using GarageBand and pitched up to mask the teacher’s voice, as well as to sound more “gingerbready.” The recorded clues were cued by holding the iPad in front of each gingerbread cutout with the Layar app interface open. When the Layar app recognized the image, it triggered an overlay of an audio recording (it could also be a video, an image, or whatever you want), and the students pressed the play button to hear the clue.

Once the group solved each of the clue “riddles,” they were excitedly off to the next location! Ms. Patterson found that it helped to hold the iPad as close to the trigger image as possible and to make sure there was enough light for the Layar app to recognize it. Teachers at KLO are excited about the possibilities with Layar, especially experimenting with using it to support language acquisition.

 


Reading Strategies Video

Ms. Hanna’s first graders at the Kennedy-Longfellow School decided to make a video to share with their families the reading strategies they are learning at school. The class began with a whole-class brainstorming session to write song lyrics based on the five reading strategies they had been learning in class:

  1. Look at the pictures.
  2. Make the first sound.
  3. Read it again.
  4. Look for a chunk you know.
  5. Think about the story.

Once the song was complete, they broke up into groups based on the five strategies. Each group devised a visual prop or idea for their scene and shared it back with the class for fine tuning. Finally, the moment the class had been waiting for: shooting the video! Everyone was feeling pretty energetic, so we started with the singing and dance scenes first. Students memorized the strategy song and sang/danced their hearts out in their classroom and in the library while the teacher videotaped the show. Then each group took turns having their scene videotaped (this took several takes each until the group agreed on a good shot). Finally, the teacher dove into editing the movie using iMovie on her laptop, including adding captions for each scene. The group was ecstatic with the final result and couldn’t wait to share it with their families. Here is the first grade Reading Strategies video:


Scratch & Programming Club (2nd-5th)

Students in the after-school Scratch Club have been using the MIT-developed software Scratch to program their own animations and video games. Scratch is a kid-friendly tool based on color-coded programming blocks that snap together to create unique scripts, or code. Since September, students have created interactive video games, multi-stage mazes, and animations. We will also be using two LEGO WeDo robotics kits and the MaKey MaKey invention kit, which uses alligator clips and a USB connection to create interactive programs between everyday objects and a computer. The Scratch Club is very excited to be presenting their projects at the 2013 Lesley Community of Scholars Day on March 27th.

Scratch Projects showcased at the Lesley University 2013 Community of Scholars event



Collective Compost eBook (2nd)

Ms. Dillon’s second grade class studied soil and compost during the month of October. After reading books, observing the KLO garden and collecting leaves and cuttings, each student created their own compost baggie, complete with worms. Each week, the bags were taken out for observation, with students recording their hypotheses, questions, and findings in their science journals through writing and illustrations. The project was documented using the iPad camera to take snapshots of the process and the recording app to record video of student questions and reflections. Images and video were then combined into Book Creator, with the final eBook being shared with all students and families.

Screenshot from the eBook:
