
What are they doing when they are not making stuff? They are making stuff at Hackathons!

August 15, 2016

Two hackers from Wingify attended the ITC iTech 2016 hackathon. This is their account of the experience.

I attended ITC iTech 2015 and had a pretty good experience (and food 😛 ). The theme was Internet of Things and they gave away a Raspberry Pi to each team. Also, they clearly stated that the idea belongs entirely to the team (unlike the Bosch Hackathon, which made us sign an agreement that the idea belongs entirely to them, and they weren't even giving away the kit).

As expected, ITC announced their second annual hackathon, iTech 2016. This year the theme was to build a VR/AR prototype or product. A little disappointed with the topic, since its scope leans heavily towards software, we started brainstorming ideas where VR and AR could be integrated with a hardware product.

After a few brainstorming sessions with Sanjay and two of our friends in Chennai, we finalized the OmniPresence Robot as the idea to submit. We shot a short video explaining our idea and registered for the hackathon. A few days before the hackathon, we received a reply from ITC confirming our selection, along with 40 other teams.

The idea of the OmniPresence Robot is to mount a camera on top of the robot. The camera streams its video feed to a VR headset over WiFi, and as the user wearing the headset moves their head, the camera pans in the same direction. Since the video fills the VR headset, it gives an immersive feel: the user sees what the robot sees. Ideally, the robot can be placed at any remote location with good internet access while the user sits at home and pans around the area. To enhance the experience further, a Kinect camera is placed in front of the user, and as the user moves, the robot moves in the same direction. So the camera on the OmniPresence Robot acts as the eyes and the motors act as the legs. The control signals are captured from the movement of the head (for the camera) and the movement of the body (for the wheels).
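To make the two control channels concrete, here is a minimal sketch of the user-side sender, assuming a simple JSON-over-UDP message format; the robot address, port and command names are illustrative, not our exact protocol:

```python
import json
import socket

# Hypothetical address of the robot on the local WiFi network.
ROBOT_ADDR = ("192.168.1.42", 9000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_head_yaw(degrees):
    """Pan the robot's camera to follow the user's head (the 'eyes')."""
    sock.sendto(json.dumps({"cmd": "pan", "yaw": degrees}).encode(), ROBOT_ADDR)

def send_body_step(direction):
    """Drive the wheels when the Kinect sees the user step (the 'legs')."""
    sock.sendto(json.dumps({"cmd": "drive", "dir": direction}).encode(), ROBOT_ADDR)

# Example: head turned 15 degrees right, user stepped forward.
send_head_yaw(15.0)
send_body_step("forward")
```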

On the day of the hackathon, our bus arrived 4.5 hours late and Sanjay and I reached the campus only by 11.45 pm, but two of our friends had already registered the team. The base of the robot was built at Chennai Makerspace and had been used for various purposes before. Sibi and Karthick, our friends from Chennai who brought the bot base, had arrived early, so we had no problem with the registration. The first task at the hackathon was to pitch the idea to the jury panel (something that wasn't there in the previous edition). After that the hackathon started. The hackspace was familiar to me since it was the exact same place as last year. The place is a little congested, but it was manageable. You get unlimited coffee, cool drinks, biscuits and chips (limited to ITC products :P).

We divided our work into two parts: Karthick and I started working on the camera motor control, while Sibi and Sanjay worked on the wheel control based on body movement. We used the HyperIMU app to understand how the different sensors on a mobile phone work; it has very neat features for setting the sample rate and streaming over UDP. We planned to use an Intel Galileo board instead of a Raspberry Pi, since the Galileo's logic level is 5V, which the motor driver requires. For some reason the Galileo stopped working while we were connecting the wires; the same thing happened to my friends at the previous hackathon. So we ended up using the Raspberry Pi with a logic level converter. We used the gyroscope's Y-axis value to calculate the yaw of the head. Initially we were trying to compute the slope (differentiating) and the peaks to get the speed and angle of rotation. Later we realized that the gyroscope's readings are angular velocities, and differentiating them would give angular acceleration, not angular displacement. So we integrated the readings to get the angular displacement. The head rotation and the camera motor were not in exact sync, since there was some delay and loss in sending the data, and there was some drift (like a steady-state error), but that didn't affect the experience much. This was around 3 o'clock, I guess. By then Sanjay and Sibi had almost finished the body tracking with the Kinect, and it worked great: stepping one step forward moved the robot forward.
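For reference, here is a minimal sketch of the yaw calculation on the Pi, assuming HyperIMU is streaming comma-separated sensor values over UDP with the gyroscope readings in the first three fields; the port and packet layout are assumptions and depend on how the app is configured:

```python
import socket
import time

UDP_PORT = 5555  # port configured in the HyperIMU app (assumption)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

yaw = 0.0            # accumulated angular displacement, in radians
last = time.time()

while True:
    data, _ = sock.recvfrom(1024)
    fields = data.decode().strip().split(",")
    # Assumed packet layout: gyroscope x, y, z come first; we want the
    # Y-axis angular velocity (rad/s) that corresponds to head yaw.
    gyro_y = float(fields[1])

    now = time.time()
    dt = now - last
    last = now

    # Integrate angular velocity over time to get angular displacement.
    # (Differentiating it instead would give angular acceleration, which
    # is what we got wrong at first.) Small sensor biases accumulate in
    # this sum, which is the steady-state drift we saw.
    yaw += gyro_y * dt

    print("head yaw: %.1f deg" % (yaw * 57.2958))
```

The integrated yaw is what gets sent to the camera motor; the drift could be trimmed with periodic re-centering, but as noted above it didn't affect the experience much.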

We had a good lunch the next day, and then the judging started, and with it the unexpected problems. Suddenly the camera motor lost its smoothness once people started using the setup. We were also sitting near a window, and the sunlight started affecting the Kinect. As a bonus to all of this, people around our table were moving a lot and the Kinect started tracking all of them. As a rule of thumb, we didn't edit the code to correct these problems; instead we tried to work around the disturbances (and couldn't). We demoed the prototype to the jury panel and it went okay.

After that there was a startup showdown before the results were announced. It was lengthy (and boring), so we played foosball and slept most of the time. Then the results were announced. They selected the top 5 teams to present their ideas on stage (we weren't one of them). After that they announced that they would be giving away 25k to some teams that had good ideas, and the first team called was ours. Some money to compensate for the travel cost 😛 They were also giving away goodies like last year, though a little fewer this time. I will surely attend the hackathon next year. Still one of the best hackathons.
