Had a lot of fun working on the Microsoft Azure U.S. Hack for Accessibility. Paired up with four really great and random teammates on this one. We wound up making AI Dog, a service specializing in providing accessible directions so users can safely and more easily navigate a campus. Here’s the entry on DevPost.
I had a great experience working with this team and thought we came up with a very creative idea. There’s a lot going on in the project, but it ultimately uses the Google Street View API to grab photos of a college campus, which we then analyze using the Azure Computer Vision API to determine whether each photo shows an accessible scene or not. We then use the data from the location photos to determine the most accessible route for the trip a user searches on our accessible website.
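The route-picking step can be sketched roughly like this. To be clear, this is a simplified illustration and not the actual project code: the function, field names, and the 0–1 scoring scale are all my own assumptions about the approach.

```javascript
// Hypothetical sketch: each candidate route carries accessibility
// scores (0 = inaccessible, 1 = fully accessible) produced by image
// analysis of its Street View photos. Pick the route with the highest
// average score. Names here are illustrative, not from the real
// AI Dog codebase.
function mostAccessibleRoute(routes) {
  let best = null;
  let bestScore = -Infinity;
  for (const route of routes) {
    const avg =
      route.photoScores.reduce((sum, s) => sum + s, 0) /
      route.photoScores.length;
    if (avg > bestScore) {
      bestScore = avg;
      best = route.name;
    }
  }
  return best;
}
```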
One of our teammates did an amazing job setting up the backend in Python. I’ll try to find the path to his repo. It really was amazing work; I’m still impressed at his skills. It was ultimately turned into an Azure Function, which we could then call from a simple web page.
I really thought we had a good shot at winning this one, but the other entrants were just as deserving, if not more so. What a great experience. If I could go back, I’d have us spend just a little bit more time on the presentation.
So I entered another hackathon through DevPost. This one was the Terminal Hackathon: Tech Takes On Mental Health. We had to come up with something for mental health.
I paired up with someone from the previous hackathon, and I believe two others joined in. With the short turnaround I was the only developer on this one, but everyone chipped in in other ways.
We wound up making Tell Me Something Good. It’s a basic page that allows a user to input how they are feeling. It then sends that submission to the Google Cloud Natural Language REST API. The API takes the submitted text and analyzes its sentiment, returning a score: 1 if it’s very positive, -1 if it’s extremely negative, with increments in between.
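As a rough illustration of turning that score into a user-facing message, something like the function below works. The cutoff values here are arbitrary choices for the sketch, not anything the API prescribes.

```javascript
// The Natural Language API returns a sentiment score between -1.0
// and 1.0. The 0.25 / -0.25 thresholds below are my own picks for
// this sketch, not part of the API.
function sentimentBucket(score) {
  if (score >= 0.25) return "positive";
  if (score <= -0.25) return "negative";
  return "neutral";
}
```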
It’s actually a pretty slick API. We threw all sorts of sentences and paragraphs at it, and the ML (machine learning) really does an amazing job of giving accurate results.
An example of the project can be found here on Glitch:
This one was very fun too. It took me a little bit to get the authorization going with the REST API, but once I got it going it was very flexible and easy to work with. I’m starting to see some very creative uses for sentiment analysis.
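For reference, the call itself is just a JSON POST. Here’s a minimal sketch of building the request body for the `documents:analyzeSentiment` endpoint, following the public REST reference; the API-key-in-query-string approach was the quick hackathon-style auth route, and your key and endpoint version may differ.

```javascript
// Sketch: build the JSON body for
// POST https://language.googleapis.com/v1/documents:analyzeSentiment?key=YOUR_API_KEY
// per the Natural Language API REST reference.
function buildSentimentRequest(text) {
  return {
    document: {
      type: "PLAIN_TEXT", // analyzing raw text, not HTML
      content: text,
    },
    encodingType: "UTF8", // how character offsets in the response are measured
  };
}
```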
I signed up pretty early on but was having a really hard time finding teammates. I asked some friends and some strangers with no luck. Just as I thought, oh well, it’s a no-go for me, I got a request to join a team.
My team was great, with people located all over the country. Many in school and some with full-time jobs. It was a solid mix.
We ultimately developed PrivIQ. It’s an Edge Extension that helps a visitor quickly see whether a site collects private data on them.
What really impressed me most was how fast the Text Analytics API analyzed and returned the response. I was sending over 5,000 characters, and it returned in milliseconds. I wasn’t expecting such an immediate response.
So once we got our response of key words, we then compared them to the list of privacy terms we were looking for. Depending on how many matches were returned, we either alerted the visitor through the Extension that yes, their data is being collected, or that it may be being collected.
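The matching logic amounts to something like the sketch below. The term list, threshold, and return messages are invented for illustration; they aren’t the extension’s actual values.

```javascript
// Sketch: compare key phrases from text analysis against a watch list
// of privacy-related terms and decide what to tell the visitor.
// The list and threshold here are placeholders, not PrivIQ's real ones.
const PRIVACY_TERMS = ["cookies", "third parties", "personal data", "tracking"];

function privacyAlert(keyPhrases, threshold = 2) {
  const lower = keyPhrases.map((p) => p.toLowerCase());
  const matches = PRIVACY_TERMS.filter((t) => lower.includes(t));
  if (matches.length >= threshold) return "data is being collected";
  if (matches.length > 0) return "data may be being collected";
  return "no privacy terms found";
}
```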
Other team members worked on the design of the extension, the web scraper that would pull the privacy text from a site, and putting it all together in an Edge Extension.
Unfortunately we had a hard time getting our scraper to work in the Extension. There were some CORS issues (which did make sense), and we attempted to call an outside server-side script, like an API, at the last moment, but alas ran out of time.
I would like to continue to learn how to better write an Azure Function. That’s the route I would have liked to take.
We got our almost fully operational example submitted just a few minutes ago, and I am really glad I participated. It really was a lot of fun and forced me to learn some new/better ways of coding.
There’s a lot going on under the hood with this one, and we’ve utilized what I think are some pretty cool techniques to make it not only easy to use but also easy for our content managers to keep up to date.
Our CMS (content management system) is OU Campus, which specializes in higher ed. They’re great for us. Their templating system relies on XML/XSLT (not always so great), so ultimately a content manager is typically editing an XML file on the staging server that is then transformed into an HTML file that lives on the production server for visitors to see.
So to make editing as seamless as possible, we set up a table in OU Campus that our content managers can edit, just like any regular page. It’s just what they’re comfortable doing day in and day out. We then leverage some XSLT magic (there are parts of it that I still believe to be magic even though I wrote it) to transform the content manager’s edits. Here’s what’s happening when they edit:
1. The content manager makes normal edits to an HTML table with columns, rows, etc. Each degree is a new row in the table.
2. On publish, the XSLT transforms the edited XML into two files. The first is a standard HTML page of the table, transformed and grouped by degree type.
This is the most basic but functional version of the page. It’s served up like this so that even visitors on the most ancient of browsers will still, at minimum, get a fully functioning list of linked degrees.
3. The table is also transformed, via another XSLT file, into a JSON file. This JSON file serves as our data source of areas of study. The external JSON file makes it easier to use the data in other pages/applications too!
So now we have two files: the HTML page, which is fully functional yet a tad boring, and a JSON file just waiting to be used.
We didn’t use the vue-cli, just some vanilla Vue for this one. If we do a 2.0 we’ll probably go with vue-cli, as we have a much better understanding of it and its benefits now.
4. The Vue script not only filters by title, school, etc., but we also include a field for tags. This really opens the door for us to make sure our degrees can be found not only by their proper names, but also by names used at other institutions, or even by career-driven search terms. This should be a big upgrade for all.
5. To top it off, we collect Events in Google Analytics for the terms searched. This data will help us make sure the tags being used make sense, and whether we should consider adding new tags or even renaming or launching new degrees in the future.
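The heart of the tag-aware search is a filter along these lines. The field names (title, school, tags) are my guess at a reasonable JSON shape for illustration, not the page’s actual schema.

```javascript
// Sketch: a degree matches if the query appears in its title, school,
// or any of its tags. Field names are assumed, not the real schema.
function filterDegrees(degrees, query) {
  const q = query.trim().toLowerCase();
  if (!q) return degrees; // empty search shows everything
  return degrees.filter((d) =>
    [d.title, d.school, ...(d.tags || [])].some(
      (field) => field && field.toLowerCase().includes(q)
    )
  );
}
```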
All in all a very fun project that really helps to modernize a highly used page on our site. We’re looking to incorporate the JSON data into other apps/uses as well.
The course was really great. I had to put it down for extended periods and then come back due to my heavy workload at work, but the instructor, Maximilian, really put together a very solid Vue.js course. Highly recommended.
The final project was a stock trading app that did a good job of utilizing Vuex and Vue Router, so that was a plus for sure.
Here are some of the issues or differences I had on the final project:
Sometimes I still go with ES5 style instead of the ES6 higher-order functions. I forget .forEach, .map, etc., but am getting better at giving them a shot alongside the good old for loop that I tend to lean on.
I went with Bootstrap 4 instead of the Bootstrap 3 used in the tutorial.
For now I’ve omitted the Firebase integration. I’ll probably go back and include that at some point, but I was thinking of implementing it in a slightly different way than was proposed in the lesson.
I had to change the publicPath in the config for the project to work on GitHub Pages. This had me stumped for a few minutes, but it made sense once I thought it through.
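For anyone who hits the same wall: GitHub Pages serves a project site from /&lt;repo-name&gt;/ rather than the domain root, so the default publicPath of "/" makes every asset 404. A minimal vue.config.js along these lines fixes it (the repo name below is a placeholder, not my actual repo):

```javascript
// vue.config.js — minimal sketch. "stock-trader" is a placeholder;
// use your own repository name. In development the dev server still
// serves from "/".
module.exports = {
  publicPath:
    process.env.NODE_ENV === "production" ? "/stock-trader/" : "/",
};
```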
Attended the OUTC conference in Anaheim, CA again this year. Presented again on Using Google Lighthouse to Find a Faster Website.
The trip out there was a little rocky with a decent amount of turbulence. While I typically do not travel very well, I was happy to not feel super lousy when arriving at LAX! Usually I’m in bad shape for the first day after a flight.
I was able to check in super early at the hotel, and took a much needed shower. I do think that was a key factor in me not feeling too poorly that day.
Arriving early on Sunday, we headed out to Irvine Regional Park. This place was awesome! The mountains, openness, so many cool things to do. Though I probably only saw 1% of the park, it was pretty awesome to take in all the differences that I would not find in New Jersey. A great experience.
Getting back to the hotel, we were served In-N-Out Burger from a food truck. The Double-Double really hit the spot; I was starved. It was great to meet up with earlier OUTC friends Caleb and Mark. They’re good guys.
Later that night, with a little bit of hard work, I was able to get HBO Go to stream on the large TV in the hotel room. On the West Coast there was the option to watch it at 9:00 pm or stream it at 6:00 pm, when it aired on the East Coast. I had to present the next day and flew in that morning, so getting it to stream at the early time was a great success.
On Monday the sessions kicked off. I presented at 2:00 pm, and while I think I got off to a shaky start, overall it went well. It wasn’t a very talkative crowd, but I could tell most everyone was following along and testing their sites with Google Lighthouse. It was fun.
The OU Campus 11 demo was pretty cool. I am excited at the direction the interface is going in. I think our content managers will be very happy with the coming updates.
I always look forward to hearing about their Product Roadmap, and that took place on Tuesday. They mentioned the following:
WCAG 2.1 (April 25)
Web Hooks (June) – Trigger actions outside of OUCampus. Can be assigned on folder.
Image Size Sets (July) – Set size (crop or not), group to set, assign to folder
File Uploads for LDP Forms (September)
Feed Manager (Winter) – easier feed management
OUCampus 11 (Winter)
Phase 1 – look and feel
Later phases to workflow
Formstack – available in Marketplace
Looking Further Forward
Accessibility Check/Insights Improvements
V11 Phase 2+
We’re excited to see exactly how Web Hooks will work. There is some decent potential there.
Later that night was the Hackathon. While I’ve been lucky enough to have been on the winning team in the last two events, I wanted to try a different role this year. I partnered up with Aaron, Soe, Fernando, and Nick, in that order based solely on seating position when the Hackathon began.
Aaron really wanted to develop a command line interface for OU. Fernando was all in with the idea, and quite frankly I was a little on the fence. In the past I’ve leaned on user-driven ideas, not so much admin. But Soe and Nick were all in, and I really wanted to try entering with someone else’s idea this year.
I can’t say this enough, but we really worked so well as a team. Fernando was a command line genius and Aaron and Soe were great coders as well. They did such a great job, and far exceeded what I thought would get done in that timeframe.
Nick and I decided to focus on the presentation and marketing aspect of our team. We came up with talking points and images to be used. We even used the official presentation template (since I was a presenter) and got some big chuckles from the audience. Presentation can be a big factor in hackathons and we had a lot of fun with that.
It was a good thing that we worked so well as a team, as there was some very stiff competition this year. Translation tools, image compression gadgets, XSLT data parsing, so many great entrants. In the end we did win, and I somehow wound up with three in a row. Insane.
The last day’s workshops were very cool too, but there was so much going on in them that I will have to revisit my sandbox and review the code before I forget too much. I’m very interested in learning more about PCF data outputs, especially outputting to JSON.
The flight home was smooth and a great way to end a great conference. While I’m not learning quite as much as I used to, it is fun to be in the role of the veteran who is now sharing knowledge with the newer folks to the platform. Glad I was able to attend again and looking forward to working in all the new ideas that were shared!