Dreamforce Hackathon Diary-in-Hindsight
Mon, 05 Sep 2011

So, last week (Mon thru Weds), I spent the better part of each day participating in the Dreamforce Hackathon 2011 #DFHack – the result of three days’ work on creating SocialVoxels was a nice $5000 prize via American Express Gift Cards (which are actually an oddity to use, being not accepted in a number of places I usually shop at). I’m typically the weird one out, presenting the app that’s just odd — yet eerily on-the-mark for the event. A couple of people have asked me how I came up with the idea and what the whole process was like – so here goes:

Initial Ideas Stage / Sunday

I found out about #DFHack late, so I didn’t start my brainstorming and planning process until Saturday/Sunday. For this initial stage, I typically work in phases, with raw “first blush” ideas, then read-the-rules (and judging criteria), then idea filtering, last-round ideas, then final filtering. This whole process is typically done in parallel, so to speak, as I tackle weekend errands and such – and get inspired by these usually dull events. Checking into Costco on Foursquare, scrolling through the usual bunch of comments, transportation, the works. There are generally tons of tips for every popular venue, just as there are tons of comments for every popular post online. There’s lots of people out there. But, people’s comments get lost in a sea of text. There’s a problem with metadata being analyzed in a totally anonymous or stats-only way. There’s also a problem with people trying to find what they need in this mess of metadata. Good stuff gets lost – and the more social stuff we have, the more stuff goes up there, and the more of that gets lost.

I had two ideas left in “final filtering” –

  1. a graph-based “true relevance” search engine:

    Basically, “what you need, when you need it… through the social graph.” People update certain need-to-know statuses such as “I need a job,” “I have to sell a Burberry scarf,” or “I need a room,” and then, using the relations-based magic of a graph database, they start learning who in their circle (and beyond) can “answer” these needs for them. Then, scraping from Twitter and Craigslist and such, it also expands beyond just the people in the network. The plus is that InfiniteGraph does the graph magic… it runs on Java, and Heroku (one of three choices of cloud tech to use to meet the hackathon requirement) is good for Java. Also, showing a graph-based demo onscreen looks cool! The minus is that I’d have to demo with a small real data set (due to… no reach!) and have to pad it out with a lot of bull’d data.
  2. a human-readable way to express checkin data:

    Well, this would have to be creative, since omniscience – the ken to perceive it all in light of infinite data – is something we mere humans lack. I like building things in 3d – but there’s always a limit to how much you can convey on mobile, and then there’s whether your typical user would get it. For the most part, everyone gets Lego blocks, even if they might not totally like them. And user data consciously contributed by the user is almost always lost in a sea of text – it’s the bane of letting *everyone* post. Human eyes were not designed for reading, per se – if you believe in evolution, eyes are really just for visual stuff – like, there’s a huge cave there, don’t run smack into it! So, what if every user could contribute a block, tied to their metadata (of course), to create a new virtual “statistic” of a venue? (A rough sketch of what such a block record might look like follows this list.) Looking at the Dreamforce agenda, especially that whole iPad Executive thing, it seems that tablets are totally in this year – though I could show that awesome self-searching graph from the first idea on the iPad as well.
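Jumping ahead a bit, this second idea is the one that became SocialVoxels. The post doesn’t include the actual data model (and the app itself was built in Unity), so this plain-Java sketch is only to show the shape of the data I had in mind – one voxel per user contribution at a venue, tied to that user’s metadata; the field names here are made up for illustration:

    // Hypothetical sketch only -- not the actual SocialVoxels schema.
    // One voxel = one user-contributed block at a venue, tied to that user's metadata.
    public class Voxel {
        public String venueId;   // e.g. a Foursquare venue id
        public String userId;    // who placed the block
        public int x, y, z;      // position in the venue's voxel grid
        public int colorArgb;    // block color
        public long placedAt;    // unix timestamp of the checkin

        public Voxel(String venueId, String userId, int x, int y, int z,
                     int colorArgb, long placedAt) {
            this.venueId = venueId;
            this.userId = userId;
            this.x = x;
            this.y = y;
            this.z = z;
            this.colorArgb = colorArgb;
            this.placedAt = placedAt;
        }
    }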

The first idea was on my mind largely because of InfiniteGraph 2.0, which had come out the week before at the NoSQL Conference. I thought about the last bit and figured it’d be neat to display something other than the usual chartsy stuff based on statistics – statistics being a data type that, since its conception, is human-readable, and not just some shady metadata “collected anonymously for analytics purposes only.” And then I thought about how I could kill two birds with one stone with the InfiniteGraph Hackathon, too… I was really tempted to do the graph thing, but then I thought back to the whole Semantic Web movement, and how insignificant the graph hack stuff is in light of that (and Facebook and LinkedIn graphs and such) – and especially since I’d basically be depending too heavily on the db to make the local associations and such. It would have been almost like going to a barista competition with a “secret sauce” of having Starbucks ready to cater… Rules…

Always keep the rules – the four main judging criteria – in mind…

Originality/Innovation – 25%

The submission creates an innovative and original solution to a known problem or creates a new market opportunity. Innovation isn’t strictly about ideas; the submission also executes on the innovative idea.

When you go to a cloud conference, the thing that EVERYONE has is data. Lots of data. Most of it junk or not easily made useful. Much of it ephemeral. Poof, and then it’s no use. And then it’s left for the analytics folks to do magic with Hadoop and such. The point being, most of the data collected was never really meant for human eyes to comprehend, which is why stuff like Hadoop is da bomb (kind of like quantum physics – you need an uber-scope like an SEM to be able to “see” any of it – if you call that seeing).

What if there were a data type (other than photos) that was intrinsically meant for human eyes? It’s beyond the usual metadata, for sure. Hmm, but how could this be useful data… Well, make it fun, at least…

Add in some sound — in line with my other phase of uber-accessibility and the pipedream of letting blind people “see” on eyes-dependent touchscreen smartphones. Assign colors to musical notes, and each stack of blocks becomes an unchangeable little tune – the blocks play back in order, one note per color.

Effective Use of Cloud Tech – 25%

The submission utilizes all available cloud technology to its full potential and avoids using on-premise solutions where possible.

Lots of data needs cloud storage.

Relevance to the DF11 “Social Enterprise” theme – 25%

The submission is relevant to the Hackathon Social Enterprise theme. It’s Mobile, Social, or Open.

Totally, all three! But, more on the Social Enterprise:

  1. true relevancy via graph – imagine HR being able to instantly recruit that one employee to change the world
  2. lego-ize the world – a new social metric! – “a new social enterprise – imagine the millions of people out there… each contributing a voxel at a venue”

Judges Proclivity – 25%

Is the submission a high quality submission? Would you use this application?

I’ve been lucky enough to attend enough startup pitches in front of similar audience types to have a good feel for the last one (I’m generally 9/10 in agreement with the outcome – there’s always the odd one, where the judges end up picking the pansy — whhh-why?).

So, to answer the last one, my answer was – yes, I would *so* love to do Minecraft in the real world! Yes, there was a risk of business types not “getting” what I was doing.

The Meat of the Hack

The way I do hackathons is that I go there with a solid idea, and then, when I’m finally there, I start building it from scratch (with the help of frameworks, libraries, the expected high-level programming and scripting languages, and such generica, of course). It’s a purist’s perspective — and it’s also aptly scrappy: if it doesn’t turn out well, you’ve lost the least amount of time. (And you’ve had another chance to experience the zen of that state of chaos in your head on “the brink of exhaustion.” Truly, this boundary between consciousness and unconsciousness is an awesome state to milk for ideas that could change the world. Or gems that you can use to prove yourself insane to yourself!)

On Monday, I started setting down the foundations of the core engine. This was “architecture in a day.” (And, yes, this explains why 90% of all hackathon apps need to be rewritten. Design, in the architecture sense, is the most important element for the long-term stability of an app – and it should take weeks, not hours.) Part of the day was LucidChart flowcharts, UML diagrams, layout sketches, and Todoist for tasklisting. The other part of the day was actually coding the 3d engine for adding the blocks – which didn’t take long at all, thanks to Unity. At the end of the day, the app was wrapped up with data ready to be I/O’d to cloud storage.

On Tuesday and part of Wednesday, I focused on integrating a bunch of APIs and I/O-ing data from the cloud. I also tested things out on my iPad 2 a bit, mostly the GPS side. I phased in and out of different Dreamforce sessions, attended the evening events – a few receptions and parties in the usual SF haunts, and also the Metallica concert!

By Thursday, I was in zombie mode, both from having worked quasi-intensely through most of each day and from attending a bunch of evening events (averaging about 3 hours of sleep a night).

And then the part I’d spent most of Thursday preparing for — pitching. I’m not good at public speaking or pitching, but, well, worst case, if nothing happens, I get another chance to practice. It went okay, as, I guess, some parts of my point got through. I didn’t get to demo as much of the app as I’d have liked (it was 2 minutes, strict) — and I didn’t even get to explain the real social-enterprise significance.

So, the guy whose AR book I’m tech-reviewing won first place, and I won second. (No relation — we met for the first time at the event. Apress does everything so remotely that no one knows anyone on the stack.)

Tap Shake Messenger from #MutherMobile
Sun, 26 Jun 2011

So, it’s an Android smartphone app that opens up the usually highly-visual nature of these devices for blind and visually-impaired people using haptics. The one I demo’d has no visual UI – the UX is all haptic braille and long/short-touch input.

I spent the last weekend at WIP’s “Muther of all Hackathons” creating what I’d originally hoped to be a “smartphone platform for the blind” called “Tap Shake Client,” but had to downgrade to “Tap Shake Messenger” since the main features ended up being only messaging and reverse-geocoding lookup. I generally create silly or quirky 3d-graphics visualization apps for hackathons, and this was the first time I tried creating an app potentially useful and life-changing for an entire population. It was fun! I won a couple of prizes, including Immersion’s grand prize and AT&T’s accessibility prize. I also won prizes from deCarta and BlueVia, and I think one other, which my sleep-deprived amnesia from the weekend has caused me to forget. (Sheepishly, I admit the unexpected deCarta prize was by far the best. Their evangelist – a fellow Southpark fan – really reached out to the inner Cartman in me with a 64 GB iPad 2!!!)

Here’s my Prezi presentation, which briefly explains the need as well as the disruptive notion of an app with no visual UI – with user interaction all via haptics and temporal touchscreen input (long touch, short touch, etc):

Here’s the outline of the main Braille components:

Legend:
Each braille character is composed of 6 dots. Spaces are just spaces.
Long click = filled dot
Short click = unfilled dot
Double tap = space
Six short taps (a blank character) = send the SMS!

For example, the letter “b” (dots 1 and 2 raised) is entered as long, long, short, short, short, short.

I composed a very hackish representation of the braille character set using this serialized array, with 1s representing filled dots and 0s representing unfilled dots. (Braille is almost a case-insensitive language: the upper and lower cases are exactly the same, but uppercase is prefixed with a capital-sign symbol, denoted as C below. And yes, numbers are not yet represented – it would have taken just about 30 secs more to type in the numbers, but tempus fugit!)

	public int totalCharSet=28;
	public char lookupChars[]={    'E' ,'a','b','c','d','e','f','g','h','i','j','k','l','m','n','o','p','q','r','s','t','u','v','w','x','y','z','C'  ,'N'  ,'.'   };
	    public int lookupDot1[]={   0  , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 0 , 0 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 0 , 0 , 1 , 1 , 0 , 1 , 1 , 1 , 0   , 0   , 0    };
	    public int lookupDot2[]={   0  , 0 , 1 , 0 , 0 , 0 , 1 , 1 , 1 , 1 , 1 , 0 , 1 , 0 , 0 , 0 , 1 , 1 , 1 , 1 , 1 , 0 , 1 , 1 , 0 , 0 , 0 , 0   , 0   , 1    };
	    public int lookupDot3[]={   0  , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 1 , 0 , 1 , 1 , 1 , 0   , 1   , 0    };
	    public int lookupDot4[]={   0  , 0 , 0 , 1 , 1 , 0 , 0 , 1 , 0 , 1 , 1 , 0 , 0 , 1 , 1 , 0 , 1 , 1 , 0 , 1 , 1 , 0 , 0 , 1 , 1 , 1 , 0 , 0   , 1   , 0    };
	    public int lookupDot5[]={   0  , 0 , 0 , 0 , 1 , 1 , 0 , 1 , 1 , 0 , 1 , 0 , 0 , 0 , 1 , 1 , 0 , 1 , 1 , 0 , 1 , 0 , 0 , 1 , 0 , 1 , 1 , 0   , 1   , 1    };
	    public int lookupDot6[]={   0  , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 1 , 1 , 1 , 1 , 1 , 1 , 1   , 1   , 1    };

I then used a combination of different parsing functions to convert input to output and the other way around.

Dots to Character

 
    /* given braille dots, outputs index (int)
     * @ bc[] = dots[] (braille dots) 
     */
    public int dots2index(int bc[]){
    	// given braille dots, returns index
    	int index=-1; // returns index of char, -1 if otherwise
    	for(int a=0;a<totalCharSet;a++){
    		if(bc[0]==lookupDot1[a]
    		 &&bc[1]==lookupDot2[a]
    		 &&bc[2]==lookupDot3[a]
             &&bc[3]==lookupDot4[a]    
        	 &&bc[4]==lookupDot5[a]      
             &&bc[5]==lookupDot6[a]        	                     
               ){
        			index = a;
        			return index;
                } 		
    	}
    	return index;
    }
 
    public char index2char(int index){
    	// given an index, returns character
    	return lookupChars[index];
    }
 
    public int parseNewDot(){
    	currentDotCount++;
    	if(currentDotCount>=6){
    		int index=dots2index(dots);
    		if(index!=0){
	    		playCharDots(dots); // sound off currently stored
	    		storeChar(index); // store char
	    		newBrailleChar(dots,charLength*2,0); // prints out char
	    		resetDots(); 
    		}else { // index 0 is the blank 'E' character (six short taps): send instead of storing
    			sendMsg();
    		}
    	}
    	return currentDotCount;
    }
 
    public boolean storeDot(int b){
    	dots[currentDotCount]=b;
    	return false;
    }

Character to Dots

   public void string2vibes(String s){
    	Log.i(appName,Integer.toString(vibeagain));
    	if(vibeagain==1){
	    	char[] cArray = s.toCharArray();
	    	int indexCount=0;
	    	int[] indexStored=new int[1024];
	    	for(char c:cArray){
	    		int c2i=char2index(c);
	    		indexStored[indexCount] = c2i;
	    		indexCount++; // advance the store index for the next character
	    		int[] tempDot={lookupDot1[c2i],lookupDot2[c2i],lookupDot3[c2i],lookupDot4[c2i],lookupDot5[c2i],lookupDot6[c2i]};
	    		playCharDots(tempDot);
	    	}
 
	    	vibeagain=0;
	    	new CountDownTimer(30000, 10000) {
	    		public void onTick(long millisUntilFinished) {
	    	     }
	    	     public void onFinish() {
	    	         vibeagain=1;
	    	     }
	    	  }.start();
    	}
    }
 
    public int char2index(char c){ 
    	// given a string char, returns braille index (lookupChars)
 
    	int brailleIndex=-1;
    	for(int i=0;i<totalCharSet;i++){
    		if(lookupChars[i]==c){
    			brailleIndex=i;
    			return i;
    		}
    	}
    	return brailleIndex;
    }

Dots to Haptics (currently using hardcoded UHL FX – #27 and #30 for long and short vibes, respectively). They sound good on an LG Optimus 3D, but weird on an Evo. See below for the serialization note.

    public int[] dot2fx(int d[]){
    	int[] temp=new int[6];
    	for(int i=0;i<6;i++){
    		if(d[i]==1){
    			temp[i]=27;
    		}else if(d[i]==0){
    			temp[i]=30;
    		}
    	}
    	return temp;
    }
 
    // a serializer is actually needed to play the series of "dot haptics"
    public void playCharDots(int[] d){
    	Log.i(appName,"Playing...");
    	mLauncher.playSequence(dot2fx(d),COMPONENT_GAP);
    }

Here’s the flow of the app. On launch, a poll is made to my web server with the latlng, and a reverse-geocoding lookup is done on deCarta. The app vibrates the location back to you (string2vibes()). Then, the user can input braille as a series of long and short clicks on the touchscreen (parseNewDot()). Each touch currently gives instant feedback (a long or short vibrate), and after 6 touches, the entire braille character is haptically regurgitated as the combination of long and short vibes. Six short touches sends the SMS off to the hardcoded phone number.
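The touch-handling glue itself isn’t shown in this post. As a minimal sketch (the threshold, field, and method names here are assumptions, apart from storeDot() and parseNewDot() above, and double-tap-for-space handling is omitted), the input side can be wired up with a plain OnTouchListener that measures press duration:

    // Minimal sketch of the touch-to-dot glue -- not the actual TSM source.
    // Assumes the storeDot()/parseNewDot() methods shown above, plus imports for
    // android.view.View and android.view.MotionEvent.
    private static final long LONG_PRESS_MS = 400; // assumed threshold for a "long click"
    private long downTime;

    private void hookUpBrailleInput(View inputView) {
        inputView.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                if (event.getAction() == MotionEvent.ACTION_DOWN) {
                    downTime = System.currentTimeMillis();
                } else if (event.getAction() == MotionEvent.ACTION_UP) {
                    long held = System.currentTimeMillis() - downTime;
                    storeDot(held >= LONG_PRESS_MS ? 1 : 0); // long = filled dot, short = unfilled
                    parseNewDot(); // after 6 dots: play the char back, or send the SMS on a blank
                }
                return true;
            }
        });
    }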

Due to time constraints, the app basically used “out of the box” UHL Effects, instead of custom-tuned haptics via Motiv Studio. There were some issues with having these UHL Effects play back to back, so Jason from Immersion helped with a serialization function that got the sequences above to play properly, one effect after another. Instead of using the default Launcher class, I used Jason’s LauncherEx.
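LauncherEx itself isn’t included here. Purely as an illustration of the serialize-the-long-and-short-vibes idea – and not what the app actually does, since the real thing plays Immersion UHL effects – the same back-to-back-with-gaps pattern could be faked with the stock Android Vibrator:

    // Illustration only -- the real app plays UHL effects via LauncherEx, not the
    // stock Vibrator. This just shows playing long/short vibes back to back with gaps.
    private void vibrateCharDots(Context context, int[] dots) {
        Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        long[] pattern = new long[dots.length * 2];
        for (int i = 0; i < dots.length; i++) {
            pattern[i * 2] = 150;                           // pause before each dot
            pattern[i * 2 + 1] = (dots[i] == 1) ? 400 : 80; // long vibe = filled dot, short = unfilled
        }
        vibrator.vibrate(pattern, -1); // -1 = play the sequence once, no repeat
    }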

A few caveats to note: be sure to set the permissions for haptics and SMS (android.permission.VIBRATE and android.permission.SEND_SMS) in the manifest, or your app will crash (almost) immediately. (That took about an hour or two of frantic hair-pulling during the hackathon.) Also, you should disable the default haptic feedback in your haptics view:

main.setHapticFeedbackEnabled(false);

Procedural Braille Character Renderer

Someone also asked about how the braille text was generated. It was a quick hack since I thought displaying visual feedback of regular letters only would be lame. Here’s the outline:

The debug braille glyphs are generated using the android.graphics.Canvas and android.graphics.Paint classes – particularly, each glyph is a mix of filled/unfilled circles bounded by an unfilled rectangle denoting the braille character grid. I have Ball and Rect classes for easy constructor-based creation of the dots and bounding rectangle that compose each glyph (a rough sketch of Ball is at the end of this post). The core functions of the RenderBraille class are below.

    // layout constants for the multi-character debug view; dots are spaced 10 apart
    float charSizeX=10; // each braille character cell is 10x20
    float charSizeY=20;
    float dotDistance=10;
    int dotRadius=3; // radius of each dot
    float dotType=0; // raised=1 or lowered=0
    float xmin=10;
    float ymin=10;
 
    public void newBrailleChar(int[] d,int cminx,int cminy){
    	float xmin=cminx*charSizeX;
    	float ymin=cminy*charSizeY;
    	float xmax=xmin+charSizeX;
    	float ymax=ymin+charSizeY;
    	main.addView(new Rect(this,(xmin+5),(ymin+5),(xmax+15),ymax+15,0));
    	for(int i=0;i<6;i++){
    		renderDot(i,cminx,cminy,d[i]);
    	}
    }
 
    public void renderDot(int dotCurrentIndex,int charIndX,int charIndY,int dotType){
    	int dotIndX=-1;
    	int dotIndY=-1;
        if(dotCurrentIndex==0){
       		dotIndX=0;
       		dotIndY=0;
        }else if(dotCurrentIndex==1){
       		dotIndX=0;
       		dotIndY=1;
        }else if(dotCurrentIndex==2){
       		dotIndX=0;
       		dotIndY=2;
        }else if(dotCurrentIndex==3){
       		dotIndX=1;
       		dotIndY=0;
        }else if(dotCurrentIndex==4){
       		dotIndX=1;
       		dotIndY=1;
        }else if(dotCurrentIndex==5){
       		dotIndX=1;
       		dotIndY=2;
        }
        main.addView(new Ball(this,xmin+dotDistance*dotIndX+charIndX*charSizeX,ymin+dotDistance*dotIndY+charIndY*charSizeY,dotRadius,dotType));
 
     }

You can generate a line of braille glyphs by calling

newBrailleChar(dots,charLength*2,0); /* dots is a 1x6 array that is 1 for raised and 0 for lowered, counting from the left column down and then the right column, as with the convention above. */
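The Ball and Rect classes themselves aren’t included above. A Ball is essentially a tiny custom View that draws one filled or unfilled circle with Canvas and Paint; a minimal sketch (constructor and field details are assumptions) might look like:

    // Rough sketch of the Ball helper view -- the actual class in the app may differ.
    // Draws a single braille dot: filled (raised) or outlined (lowered).
    // Assumes imports for android.content.Context, android.graphics.Canvas,
    // android.graphics.Color, android.graphics.Paint, and android.view.View.
    public class Ball extends View {
        private final float cx, cy, radius;
        private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

        public Ball(Context context, float cx, float cy, float radius, int dotType) {
            super(context);
            this.cx = cx;
            this.cy = cy;
            this.radius = radius;
            paint.setStyle(dotType == 1 ? Paint.Style.FILL : Paint.Style.STROKE);
            paint.setColor(Color.BLACK);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            canvas.drawCircle(cx, cy, radius, paint);
        }
    }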