For the short film assignment I wanted to play around with version 4.26 of Unreal Engine. I was very excited to find a brand new environment available for free on the marketplace that I could use as my setting for experimenting. I also wanted to bring back the old man with the cigar character.
When I tried to load in the pre-built worlds I ran into disk-space issues: even though everything was supposed to be on my separate drive, some assets were constrained to my local disk. This, along with the large textures, kept crashing the editor. Instead I began working with an HDR background texture, which looked realistic and was fun to play with, and I started using the Sequencer to create some shots. I was definitely a little in over my head trying to animate the cine cameras, and I've realized 24 fps looks quite bad coming out of Unreal. I need to keep working on creating animations and spawning events.
Exploring skeletal meshes was a lot of fun. I’m still getting the hang of the Unreal workflow but the ability to start playing around with animations and simulated gravity immediately in a third person environment is amazing.
I added two different animations that I thought could work together. I imported the old man smoking a cigar from Mixamo, as well as the "swing to land" animation. It almost looks like Spider-Man swinging down to hit a villain. I'd love to trigger the animations through the main character colliding with an object or something like that, so it's not just a repeating cycle.
Things I want to keep exploring are changing the environment and customizing the world and controls more.
The endangered animal I chose to work with is the Hawksbill Turtle. The Hawksbill Turtle is most threatened by the illegal wildlife trade: its beautiful tortoiseshell is used in many products such as jewelry and ornaments. They are also affected by the loss of nesting and feeding habitats. They are an integral part of maintaining the health of coral reefs, and they are harmed by pollution such as the plastic fork I used in the design. I definitely need to collect more recycling materials for future prototypes because I didn't have much to work with.
I started with a Starbucks coffee box that looked almost like it had fins already. I had the plastic fork from takeout, as well as six small plastic germination domes used for an automatic planter. I like that the dome was shaped almost like a turtle shell, although it turned out to be too small and rigid to manipulate. I started to sketch and formulate how I was going to design the turtle.
I was able to bend one of the germination domes in half to create the narrow beak that the Hawksbill Turtle is also known for.
I used larger pieces of the fork to create the turtle’s eyes.
I cut out the fins in the most natural way I could with the cardboard and scissors.
While turtles cannot fly, I enjoyed seeing my creation in the air as if it was suspended in water. I hope these beautiful creatures stick around for a long time to come.
The two online avatar systems I chose were the South Park Studios online creator and the Pop Yourself avatar creator on Funko.com.
They were both very fun: the South Park creator had the more offensive and weirder options, while the Funko product was cute and funny. It really is a difficult task to set up a successful avatar creator. How can we build a tool with enough agency to truly represent anyone who comes across it? It is a nearly impossible task, and it makes me re-examine what an avatar truly is. As the PBS video "Controlling vs. 'Being' Your Avatar" asks, are we creating a character or a true representation of ourselves? It makes me think of video games like NBA 2K that let you take a photo of yourself so the game can wrap those pixels around your avatar; that would be a better example of a truer representation of self. But what are the desired outcomes for your avatar? In what world will it live?
The South Park avatar creator is very successful because all of the options are within the visual language of South Park characters. Clicking through the different options felt like re-living an episode of the show. There was enough abstraction that I did not feel it resembled me at all. Instead I chose things like the "New Jersey" skin tone, a jab at bad New Jersey spray tans; as an Italian from New Jersey, I thought it was funny. I find this kind of character building more interesting.
The Funko avatar creator also had quite a bit of abstraction, as there weren't that many options to choose from. I chose the 3D goggles as a student learning 3D environments, and being able to add the cat was a nice touch. I believe avatar creators with fewer options depend on the user buying into the environment in order to care about the characters they are creating. South Park was definitely the more successful builder for this reason.
Are we more empathic towards avatars that look like us? Is this a tactic used for engagement by game designers? In what context is that ethical or unethical?
I am continuing to work from Chris Wiles' design for a mask with a snap-on filter. The rubber I wanted did not come in time, so instead I used Meshmixer to select the outer ring and recreate the same effect in the PLA design. It gave the mask a more aggressive look.
I created a photo series to document the time we are in and to use as a timestamp. These masks were created for doctors and front-line workers as emergency backups, using HVAC filter material to breathe through.
I chose black and white as my canvas to show the isolating nature of quarantine. I used the mono-light function to achieve this effect.
In light of the loss of fabrication resources, I will be scaling the project down physically and focusing more on the design of a smaller prototype. To facilitate this I built a NeoTrellis M4 Express kit, a common drum-pad development board made by Adafruit. Using it will let me take advantage of the work Adafruit has already done, so I can quickly build different prototypes and experiment.
Working through some of the example sketches on the Trellis, I found that each pad has an RGB LED as well as a programmable note.
Their MIDIUSB sketch was actually very similar to what I was looking to do and served as a great test for many of the concepts for my design.
It uses the built-in accelerometer as well as MIDI notes to trigger sounds and effects. The X axis bends pitch up and down one semitone; the Y axis changes the modulation filter. Those two controls are the most common on MIDI devices, but I still often find the dials unintuitive, and using the accelerometer felt more natural and fun. I will say that because of the Trellis's small form factor, while tilting the board felt natural, holding it that way restricts you to about two notes at a time. I want it to feel less like a Game Boy and more like an expressive instrument.
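The tilt-to-MIDI mapping can be prototyped without any hardware. This is a minimal sketch of the idea in plain Python, assuming tilt readings normalized to [-1, 1] on each axis; the function names and clamping behavior are my own, not part of Adafruit's MIDIUSB sketch:

```python
def tilt_to_pitch_bend(x_tilt):
    """Map X-axis tilt (clamped to [-1, 1]) to a 14-bit MIDI pitch-bend
    value in 0..16383, where 8192 is center (no bend). With the synth's
    bend range set to one semitone, full tilt bends a full semitone."""
    x = max(-1.0, min(1.0, x_tilt))
    return int(round((x + 1.0) / 2.0 * 16383))

def tilt_to_modulation(y_tilt):
    """Map Y-axis tilt to the 7-bit modulation controller (CC 1).
    Absolute tilt is used so tipping either way raises the depth."""
    y = max(-1.0, min(1.0, y_tilt))
    return int(round(abs(y) * 127))
```

Keeping the mapping in one pure function like this also makes it easy to tune the feel later, e.g. adding a dead zone around flat or an easing curve for slower onset.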
Upside of using the Trellis as a development board:
Open source, with many configurations to get inspired by, and a very fast board: the Trellis has a Cortex-M4 chip and built-in micro USB ready to go. I can also use CircuitPython to code in Python instead of the Arduino IDE, thanks to the extra RAM on the board.
Downside of using the Trellis as a development board:
Limited mostly to Adafruit libraries, though this is also an upside as their documentation is quite good. The largest challenge I have is twofold: I want my final design to be almost twice as big, since the Trellis, while cute, is a little too small for a serious user; and the accelerometer works best when multiple notes can be played comfortably.
I will be working on restricting the movement of the Trellis board by mounting it in a 3D-printed enclosure with springs. I also want to mount the joystick for the pitch and modulation input. I will continue to tune the settings to get the most intuitive experience I can out of it, and I will present a song demo using the enclosed board.
Arduino vs. CircuitPython: I have been coding in Arduino, but I do have experience with Python and will likely give it a try this week.
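Whichever environment I land on, the bytes going over USB are the same. As a reference for myself, here is how a pitch-bend message is packed; this is the standard MIDI channel-voice format, not anything Trellis-specific:

```python
def pitch_bend_message(value, channel=0):
    """Pack a 14-bit pitch-bend value (0..16383) into the standard
    3-byte MIDI message: status byte 0xE0 | channel, then the 7-bit
    LSB, then the 7-bit MSB."""
    if not 0 <= value <= 16383:
        raise ValueError("pitch bend must be in 0..16383")
    return bytes([0xE0 | (channel & 0x0F), value & 0x7F, (value >> 7) & 0x7F])
```

So the centered value 8192 comes out as `E0 00 40`, which is the "no bend" message a DAW expects on channel 1.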
3D Printing development:
The Trellis has many open-source files and enclosure designs online that I can use as starting points for my design.
Fully 3D-printed enclosure? Or laser-cut acrylic housing?
The midi controller will engage pitch slides based on user input in two directions.
The two options for this are a joystick connected to the board, or an accelerometer and gyroscope.
If the board is attached to the joystick securely, it will most likely be more accurate with consistent use.
In the video above I tested a simple chord progression with pitch slides. I want the controller to be locked to a scale, so studying chords and single notes is useful. These notes wobble and change together. I slowly moved the pitch up and down within one semitone, which created really interesting effects that feel like rising and falling. Putting more energy (acceleration) into the pitch change within this scale almost creates its own wah effect. Wah pedals generally boost a single frequency, and sweeping that peak on and off creates the wah; this pitch-bending wobble is a slightly more subtle way of achieving a similar result. I want to connect the tactile act of speeding up the pitch bend to the way the controller receives its input.
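Locking the controller to a scale amounts to quantizing note numbers before they are sent. A small sketch of one way to do it; the C-major scale and the tie-breaking rule (ties resolve downward) are my own assumptions, and any pitch-class set would work:

```python
# Pitch classes of C major, used here purely as an example scale.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def snap_to_scale(note, scale=C_MAJOR):
    """Return the MIDI note in `scale` closest to `note`.
    Ties between two equally distant scale notes resolve downward."""
    octave, pitch_class = divmod(note, 12)
    best = min(scale, key=lambda s: (abs(s - pitch_class), s))
    return octave * 12 + best
```

For example, C# (61) snaps down to C (60), so every pad press stays in key no matter where the wobble lands.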
My project does not deal with rhythm as directly. I want to create an exploratory tool, but one that stays within the confines of a DAW; as we saw in class, the DAW handles the time signature and BPM. I want to trigger MIDI notes and metadata for certain parameters of built-in instruments and effects. Last week I was testing pitch automation using the Soundtoys plug-in AlterBoy. I discovered that any pitch alteration beyond around 0.2 semitones starts to sound off, especially in an arpeggiator, so I want my tool to constrain wobble effects to smaller amounts. I also discovered that slides sound really nice but have to be done quickly.
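That 0.2-semitone ceiling can live in the mapping itself: clamp the requested wobble before converting it to a bend value. A hedged illustrative helper, assuming the synth's bend range is configured to one semitone (the names and defaults are mine):

```python
def clamped_wobble_bend(semitones, max_semitones=0.2, bend_range=1.0):
    """Clamp a requested wobble (in semitones) to +/- max_semitones,
    then convert it to a 14-bit pitch-bend value (0..16383, center
    8192), assuming the synth's bend range is +/- bend_range semitones."""
    s = max(-max_semitones, min(max_semitones, semitones))
    normalized = s / bend_range          # now in [-0.2, 0.2] by default
    return int(round((normalized + 1.0) / 2.0 * 16383))
```

Even a wild tilt that asks for a full semitone gets capped at the subtle range that sounded good in the AlterBoy tests.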
Create a movement interaction for wobble
Create a movement interaction for slides
I’m really inspired by the work with the accelerometer in this project by Amanda Ghassaei.