Exploring skeletal meshes was a lot of fun. I’m still getting the hang of the Unreal workflow, but the ability to start playing around with animations and simulated gravity immediately in a third-person environment is amazing.
I added two different animations that I thought could work together: the old man smoking a cigar from Mixamo and the “swing to land” animation. It almost looks like Spider-Man swinging down to hit a villain. I’d love to create a trigger for the animations through the main character colliding with an object or something like that, so it’s not just a repeating cycle.
Things I want to keep exploring are changing the environment and customizing the world and controls more.
The endangered animal I chose to work with is the Hawksbill Turtle. The Hawksbill Turtle is most threatened by the illegal wildlife trade: their beautiful shells are the source of the “tortoiseshell” used in many products such as jewelry and ornaments. They are also affected by the loss of nesting and feeding habitats. They are an integral part of maintaining the health of coral reefs, and they are affected by pollution as well, like the plastic fork I used in the design. I definitely need to collect more recycled materials for future prototypes because I didn’t have much to work with.
I started with a Starbucks coffee box that looked almost like it had fins already. I had the plastic fork from takeout, as well as six small plastic germination domes used for an automatic planter. I like that the dome was shaped almost like a turtle shell, although it turned out to be too small and rigid to manipulate. I started to sketch and formulate how I was going to design the turtle.
I was able to bend one of the germination domes in half to create the narrow beak the Hawksbill Turtle is known for.
I used larger pieces of the fork to create the turtle’s eyes.
I cut out the fins in the most natural way I could with the cardboard and scissors.
While turtles cannot fly, I enjoyed seeing my creation in the air as if it was suspended in water. I hope these beautiful creatures stick around for a long time to come.
The two online avatar systems I chose were the South Park Studios online creator and the Pop Yourself avatar creator on Funko.com.
They were both very fun, with the South Park one having the more offensive and weirder options while the Funko product was cute and funny. It really is a difficult task to set up a successful avatar creator. How can we create a tool with enough agency to truly represent anyone who comes across it? It is nearly an impossible task and makes me re-examine what an avatar truly is. As the PBS video “Controlling vs. ‘Being’ Your Avatar” brings up, are we creating a character or a true representation of ourselves? It makes me think of video games like NBA 2K and others that let you take a photo of yourself so they can wrap those pixels around your avatar. That would be a better example of a truer representation of self. But what are the desired outcomes for your avatar? In what world will it live?

The South Park avatar is very successful because all of the options are within the language of South Park characters. It was like re-living an episode of South Park, clicking through all the different options available. There was enough abstraction that I did not feel it resembled me at all. Instead I chose things like the “New Jersey” skin tone, a jab at bad New Jersey spray tans; as an Italian from New Jersey, I thought it was funny. This kind of character building I find more interesting.
The Funko avatar creator also had quite a bit of abstraction, as there weren’t that many options to choose from. I chose the 3D goggles as a student learning 3D environments, and being able to add the cat was a nice touch. I believe avatar creators with fewer options must get the user to buy into the environment in order to care about the characters they are creating. South Park was definitely the more successful builder for this reason.
Are we more empathic towards avatars that look like us? Is this a tactic used for engagement by game designers? In what context is that ethical or unethical?
I am continuing to work from Chris Wiles’ design for a mask with a snap-on filter. The rubber I wanted did not arrive in time, so instead I used Meshmixer to select the outer ring and create the same effect in the PLA design itself. It gave the mask a more aggressive look.
I created a photo series to document the time we are in and to serve as a timestamp. These masks were created as emergency backups for doctors and front-line workers, who have had to resort to breathing through HVAC filters.
I chose black and white as my canvas to show the isolating nature of quarantine. I used the mono-light function to achieve this effect.
In light of the loss of fabrication resources, I will be scaling down my project physically and focusing more on the design of a smaller prototype. To facilitate this I built a NeoTrellis M4 Express kit. This is a common drum-pad-style development board made by Adafruit. Using it will allow me to take advantage of the work Adafruit has already done, so I can quickly build different prototypes and experiment.
Running some of the example sketches on the Trellis, I found that each pad has an RGB LED as well as a programmable note.
Their MIDIUSB sketch was actually very similar to what I was looking to do and served as a great test for many of the concepts for my design.
It uses the built-in accelerometer along with MIDI notes to trigger sounds and effects. The X axis bends pitch up and down one semitone, and the Y axis changes the modulation amount. Those two controls are the most common on MIDI devices, but I still often find the dials unintuitive. Using the accelerometer felt more natural and fun. I will say that because of the Trellis’s small form factor, while the accelerometer movement felt natural, it restricts you to playing about two notes at a time. I want it to feel less like a Game Boy and more like an expressive instrument.
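The mapping that sketch performs can be written out in plain Python. This is my own rough reconstruction, not Adafruit’s MIDIUSB example code, and the function names and the ±1 g tilt range are assumptions: X-axis tilt becomes a 14-bit pitch-bend value (0–16383, centered at 8192), and Y-axis tilt becomes a 0–127 modulation (CC1) value.

```python
# Sketch (my own reconstruction, not the Adafruit MIDIUSB code):
# map accelerometer tilt readings to MIDI pitch-bend and modulation values.

def tilt_to_pitch_bend(x_g, max_tilt=1.0):
    """Map X-axis tilt in g to a 14-bit bend value: 0..16383, center 8192."""
    x = max(-max_tilt, min(max_tilt, x_g)) / max_tilt  # clamp to -1..+1
    return int(round((x + 1.0) * 8191.5))

def tilt_to_modulation(y_g, max_tilt=1.0):
    """Map Y-axis tilt magnitude to a 7-bit CC1 modulation value: 0..127."""
    y = min(abs(y_g), max_tilt) / max_tilt
    return int(round(y * 127))
```

With a ±1 semitone bend range on the synth, full tilt in either direction lands on exactly one semitone up or down, which matches how the example behaved.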
Upsides of using the Trellis as a development board:
It is open source, there are many configurations to get inspired by, and it is a very fast board. The Trellis has a Cortex-M4 chip and built-in micro-USB ready to go. I can also use CircuitPython to code in Python instead of the Arduino IDE, thanks to the extra RAM on the board.
Downsides of using the Trellis as a development board:
I am limited mostly to Adafruit libraries, though this is also an upside, as their documentation is quite good. The largest challenge I have is twofold: I want my final design to be almost twice as big, since the Trellis, while cute, is a little too small for a serious user; and the accelerometer works best when multiple notes can be played comfortably.
I will be working on restricting the movement of the Trellis board by mounting it in a 3D-printed enclosure with springs. I want to mount the joystick for the pitch and modulation input as well. I will continue to tune the settings and get the most intuitive experience I can out of it. I will be presenting a song demo using the enclosed board.
Arduino vs. CircuitPython: I have been coding in Arduino, but I have experience with Python and will likely give CircuitPython a try this week.
3D Printing development:
The Trellis has many open-source files online, with different enclosures I can use as starting points for my design.
A fully 3D-printed enclosure? Or a laser-cut acrylic housing?
The midi controller will engage pitch slides based on user input in two directions.
The two options for this are a joystick connected to the board, or an accelerometer and gyroscope.
If the joystick is mounted to the board securely, it will most likely be more accurate with consistent use.
In the video above I tested a simple chord progression with pitch slides. I want the controller to be locked to a scale, so looking at chords or single notes is useful as a study. These notes wobble and change together. I slowly moved the pitch up and down within one semitone. This created really interesting effects that feel like rising and falling. The more energy (acceleration) in the pitch change within this range, the more it almost creates its own wah effect. Wah pedals generally boost one resonant frequency, and sweeping that peak back and forth creates the wah; this pitch-bending wobble is a slightly more subtle way of achieving a similar result. I want to connect the tactile act of speeding up the pitch bend to the way we capture input in the controller.
My project does not deal with rhythm as directly. I want to create an exploratory tool, but one that stays within the confines of a DAW. As we saw in class, the DAW handles the time signature and BPM. I want to trigger MIDI notes and parameter data for built-in instruments and effects. Last week I was testing the automation of pitch using the Little AlterBoy plug-in by Soundtoys. Something I discovered was that any pitch alteration beyond around 0.2 semitones starts to sound off, especially in an arpeggiator, so I want my tool to constrain wobble effects to smaller amounts. I also discovered that slides sound really nice but have to be done quickly.
Create a movement interaction for wobble
Create a movement interaction for slides
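To make that 0.2-semitone constraint concrete, here is a minimal sketch. It assumes the standard 14-bit MIDI pitch-bend encoding (0–16383, center 8192) and a synth bend range of ±2 semitones; the function name and the clamp are my own.

```python
# Sketch (assumption): convert a pitch offset in semitones to a 14-bit
# MIDI pitch-bend value, clamping the wobble to +/-0.2 semitones as the
# listening test above suggested.

BEND_CENTER = 8192
WOBBLE_LIMIT = 0.2  # semitones; beyond this, pitch starts to sound off

def semitones_to_bend(offset_st, bend_range_st=2.0):
    """14-bit bend value for a pitch offset, clamped to the wobble limit."""
    offset_st = max(-WOBBLE_LIMIT, min(WOBBLE_LIMIT, offset_st))
    val = BEND_CENTER + int(round(offset_st / bend_range_st * 8191))
    return max(0, min(16383, val))
```

With a ±2 semitone bend range, the full wobble only ever moves the bend value about 819 counts either side of center, so even the most energetic movement stays subtle.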
I’m really inspired by the work with the accelerometer in this project by Amanda Ghassaei.
Further developing the idea/imagining it in different scales.
Currently I want to make a MIDI controller that illuminates the relationship between pitches and water. In practical terms, this will be a device for adjusting and exploring the entire pitch spectrum. I want to do it in a way that can still be used in the music production pipeline, so I will be locking the scale/key into the process.
Two big inspirations are the Roli Seaboard and the RC-20 plugin. The Roli Seaboard gives access to all of the pitches outside the traditional piano structure. When playing a note, the user can move up and down to explore the pitches in between notes. This video also demonstrates how that sounds when playing a chord. I want to explore similar principles, but away from the piano. I want the sounds to feel like an ocean, explored through controlled movement.
The RC-20 plugin also demonstrates how exact pitches have their time and place, but oftentimes our ear associates slight wobble with a vintage texture. This is because analog media like tape and vinyl add texture of their own, such as pitch wobble and noise.
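That tape-style wobble is easy to simulate digitally. A minimal sketch, assuming a sine oscillator, a 1.5 Hz LFO, and a ±0.2-semitone depth (all values of my choosing), with the phase integrated per sample so the pitch glides instead of jumping:

```python
import math

# Sketch (assumption): simulate tape-style pitch wobble by modulating a
# sine oscillator's frequency with a slow LFO, like the "wow" of a
# slightly warped record or stretched tape.

SR = 8000        # sample rate, kept small for the example
BASE_HZ = 220.0  # base pitch (A3)
LFO_HZ = 1.5     # wobble rate
DEPTH_ST = 0.2   # wobble depth in semitones

def wobbled_sine(n_samples):
    out, phase = [], 0.0
    for n in range(n_samples):
        t = n / SR
        semis = DEPTH_ST * math.sin(2 * math.pi * LFO_HZ * t)
        freq = BASE_HZ * 2 ** (semis / 12)  # offset applied in semitones
        phase += 2 * math.pi * freq / SR    # integrate phase -> smooth glide
        out.append(math.sin(phase))
    return out
```

Writing the result to a WAV file and listening back gives a decent approximation of the vintage drift the plugin adds, without the added noise layer.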
The things I want to experiment with most:
- Pitch modulation as exploration
- Added noise for texture
- Locking melodic scale while allowing for interesting/unique compositions
- Getting the wah effect of a guitar string
Practical interfaces I can utilize:
I’m really inspired by this project from Amanda Ghassaei where she uses an accelerometer and gyroscope to change notes.
Why not ride the audio wave with a surfboard? What if that surfboard had an accelerometer and gyro attached?
Let’s imagine the project in three scales/formats:
The controller is a literal surfboard where the user modulates pitch with their movement and angle. There are hand controllers to change notes.
If audio that has been printed on tape is suspended in the air, the user can pull it to adjust the sound. There is a direct link between the hardware and the audio. This isn’t music creation as much as it is performance: how can one manipulate their movements to achieve what they want from the tape? Pulling it will adjust pitch and speed. The more it is stretched and used, the harsher it will sound, adding even more noise and texture. Eventually it will completely degrade, which emphasizes the physical nature of the medium.
Can digital audio degrade naturally? Can there be a link between real physical degradation and digital audio degradation/distortion?
Online Audio Manipulator
If this project existed online as a tool, you could drop an audio file into the program. An advanced audio visualizer would also let you edit. I am inspired by the spectral display function in Adobe Audition, which shows you a heat map of frequencies and lets you manipulate it. This is mainly used to erase spikes in the frequency spectrum caused by unexpected sounds while recording audio.
Week 4 In Class
During class I’m going to set up an Arduino Nano with some push buttons to start prototyping the accelerometer feature and sending MIDI information to the computer.
To start experimenting with user paths, I honed my idea into an expressive tool for pitch modulation. This tool must also be locked to a scale to make it more interesting. The artist Blankfor.ms, speaking about the Roli Seaboard, said, “There’s an ocean between Db and Eb.” This always inspired me to see what else is possible. I wanted to take the concept further and develop a responsive board to ride the ocean pitch wave.
One of the most popular plugins in modern production is RC-20 Retro Color. A defining feature is its wobble module, which modulates pitch the way tape or vinyl would. I believe giving the user more expressive control over this function would be a real benefit.
In the video I show the setup for my device. I want to use pads instead of keys to get away from the standard piano. Instead of drawn automation, I want an accelerometer and gyroscope to drive pitch changes while notes/chords are being played.