I am a software engineer living in Baltimore, MD. I strive to deliver reliable and beautiful code. I develop for embedded systems, mobile apps, and everything in between. I love to create, build, and learn. My other passions are photography, filmmaking, and traveling. Check out some of my notable projects below.
The goal of the Abilities Hackathon is to use technology to address problems faced by people with disabilities. Our group learned that some Deaf people may prefer to watch someone signing rather than read long passages of English text (such as an online news article). This is because English is likely a second language to them, while American Sign Language (ASL) is their first language, and therefore more natural to them. Our group decided to build a "text to ASL" interpreter. A browser plugin would allow the user to highlight any text on a web page, right-click on it, and select the option to "sign this". The plugin would then display a video of an interpreter signing the selected text in ASL. The translation and video creation were done automatically by our server. The server would parse the provided text one word at a time and search an online ASL dictionary for each word. The online dictionary would provide us with short videos of individual words in ASL. We would then stitch together all of the individual videos to recreate the sentence(s) that had been selected, and display that final video back to the user. We also made an Android app that worked in a similar fashion. Pretty simple, right? Well, while it looked cool, we learned that it wasn't going to be very practical yet. We talked to multiple ASL interpreters, who were quick to point out that translating English to ASL is not that simple. They explained that ASL has a completely different sentence structure than English, so word order and grammar are very different. Clearly our naive word-for-word translation would not be enough, and we didn't have the time or the expertise to handle all of that. Making this tool actually useful would take much longer than a weekend. Nonetheless, we felt good about the foundation we had laid for future work in this area.
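The word-lookup-and-stitch pipeline can be sketched in a few lines. This is a hypothetical reconstruction, not our actual server code: the online dictionary lookup is stubbed out with a small in-memory table, and the final concatenation is expressed as an ffmpeg concat-demuxer file rather than a real video render.

```python
import re

# Stand-in for the online ASL dictionary: in the real server,
# each word query returned a URL to a short clip of that sign.
ASL_CLIPS = {
    "hello": "clips/hello.mp4",
    "world": "clips/world.mp4",
}

def clips_for_text(text):
    """Tokenize the selected text and map each known word to its ASL clip."""
    words = re.findall(r"[a-z']+", text.lower())
    return [ASL_CLIPS[w] for w in words if w in ASL_CLIPS]

def ffmpeg_concat_list(clips):
    """Build an ffmpeg concat-demuxer file listing the clips in order.
    Rendering would then be: ffmpeg -f concat -i list.txt -c copy out.mp4"""
    return "".join(f"file '{c}'\n" for c in clips)

print(ffmpeg_concat_list(clips_for_text("Hello, world!")))
```

As the interpreters pointed out, this word-for-word approach preserves English word order, which is exactly where a real English-to-ASL translation has to do much more work.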
We had a lot of fun working with each other, and learned a lot about ASL, Deaf culture, and language processing. Handy was highlighted as a Staff Pick on Devpost.com.
Sin City Sweets is a family-owned small business in Las Vegas, NV that makes delicious chocolate-covered nuts and fruits. I designed and deployed their website. The front end was developed using the Bootstrap framework.
At a local hackathon hosted by the Digital Harbor Foundation, I built a mini "television" that could play animated GIFs for endless entertainment. I used a Raspberry Pi, a Wi-Fi module, and a 3.5" LCD touchscreen to control the device. I also designed a custom case and printed it using a rapid prototyping 3D printer. This was my first time printing something that I designed and modeled myself. I modeled my device after a vintage-style TV with "rabbit ear" antenna and volume/channel knobs, but more colorful.
Our film production group created this short film for the 29 Days Later Film Project (2015). We were given 29 days to write, shoot, and edit an original short film. Our submission, Unconventional Love, won the award for best acting and took 2nd place overall! My primary contributions during this production were writing, cinematography, and lighting.
Our film team was recruited to produce a Kickstarter campaign video promoting a new fairy tale novel called The Dagger and the Rose, written by Bill Hoard and illustrated by Leah Morrison. The video differentiates itself by sharing the author's imagination with the audience through visual cues as he walks through the park. My primary roles during this production were directing, writing, cinematography, and camera operation.
At the 2014 Baltimore Hackathon, our team built a self-navigating robot. The robot was powered by a Raspberry Pi. The chassis and wheels were custom-made using a CNC laser cutter. Two motors from power screwdrivers were repurposed to drive the wheels. The robot used a cheap webcam and image recognition software (OpenCV) to locate two unique targets placed at the center of the navigable space. Using some triangulation techniques, the robot could calculate its position relative to the two targets and update its path around the area accordingly. This concept could be used in products such as a self-driving lawn mower. Unfortunately, we were unable to finish the project in the time we had. All of the components were nearly complete, but we ran out of time while trying to integrate and test them together. Despite that, our efforts won us a prize from an event sponsor.
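The position fix can be sketched as follows. This is an illustrative reconstruction rather than our actual code: it assumes the two targets sit on a known baseline and that the vision system yields a distance estimate to each (e.g. from the apparent size of a target in the webcam frame), which reduces the fix to intersecting two circles.

```python
import math

def locate(d, r1, r2):
    """Estimate the robot's position given two landmarks at (0, 0) and (d, 0)
    on the x-axis and estimated distances r1, r2 to each landmark.

    Intersecting the two range circles gives x directly; the mirror
    ambiguity in y is resolved by assuming the robot stays on the
    y >= 0 side of the line joining the targets."""
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = math.sqrt(max(r1**2 - x**2, 0.0))  # clamp for noisy measurements
    return (x, y)
```

For example, with targets 6 units apart and both ranges reading 5 units, the robot sits at (3, 4), the apex of a 3-4-5 triangle on each side.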
A group of friends and I created this short film for the 48 Hour Film Project in Baltimore. The entire film was written, shot, and edited in just 48 hours! Our selected genre was "Time Travel". It's not perfect, but we got a lot of good laughs from the audience when it was screened in a theater a week later! Of the 40+ film submissions, our film won awards for Best Writing, Best Use of Prop, and an Audience Favorite award. My primary contributions on this project were cinematography, camera operation, and assisting with writing.
At the Walters Art Museum's annual Art Bytes hackathon, our team set out to recreate a full-scale replica of a well-known piece of Baltimore art: Giuseppe Ceracchi's "Bust of George Washington". With the help of Direct Dimensions, we were able to take an impressive micron-resolution 3D scan of the statue. Printing the full-scale statue would have required professional equipment, and a lot of time and money. So instead, we crowdsourced the effort. Hobbyists around the world own desktop printers that can print small-scale objects. Using 3D modeling software, we sliced the full-scale model into 110 unique blocks small enough for desktop printers to handle. We created a website that allowed contributors to join our project, download a piece of the model, print it at home, and mail it back to us for reassembly. The website was live and functional by the end of the hackathon, and a few weeks later our 3D printed replica of George Washington was complete and on display at the 2014 3D Print Show in NYC. By distributing and crowdsourcing the effort, the maker community was able to build something special together.
Photography is a huge passion of mine. To grow as a photographer, I challenged myself to take at least one photo a day for an entire year. No themes or bounds: just find something interesting in my daily life and capture it. I shot the photos on my smartphone and shared them on Instagram, completing the project in April 2013.
At a local hackathon hosted by Betamore, we were challenged to "power the future". Our idea was to create a cheap, wirelessly controllable air vent, or "smart vent". If homeowners replaced all of the air vents in their home with smart vents, they could have much more granular control over the airflow in each room. For example, the system could be programmed to close off vents to certain rooms when not needed, such as the kitchen in the middle of the night. This would allow for better temperature control throughout the house and better energy efficiency. We were able to build a proof of concept for our idea during the hackathon weekend. We attached a cheap servo motor to a standard home vent so it could open and close the vent. We then used an Arduino microcontroller to control the motor and accept commands from a PC acting as a server. We also created a mobile app that asked the user for a target temperature. The app communicated with the server over Wi-Fi, and the server determined whether the vent should be open or closed based on current temperature conditions. Our project won 1st place in the hackathon. My primary role was developing the Android app for the end user.
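The server's open/close decision can be sketched as a simple thermostat rule. This is a hypothetical reconstruction, not our actual server code; the deadband is an assumption added here to keep the servo from chattering when the room hovers right at the setpoint.

```python
def vent_command(current_temp, target_temp, vent_is_open, deadband=0.5):
    """Decide whether the vent should be open, assuming heating mode
    (open the vent to let warm air into a cold room).

    Temperatures are in the same units (e.g. degrees F). Within the
    deadband around the target, keep the vent in its current state."""
    if current_temp < target_temp - deadband:
        return True   # room too cold: open the vent
    if current_temp > target_temp + deadband:
        return False  # warm enough: close the vent
    return vent_is_open  # near the setpoint: don't move the servo
```

The returned boolean would then be sent down to the Arduino, which drives the servo to the open or closed position.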
A group of friends and I entered an online short film contest. Part of the contest involved using an unfinished script: we were challenged to finish the given script and create a unique story out of it. We wrote the remainder of the story and a screenplay, and then shot the film with a Nikon D7000 and a Canon T4i. The entire film was shot and edited in a week. My primary role was cinematography. This was our first film production together.