This is more of a conversation starter than a tutorial post. I often find myself using
curl on API endpoints from the Terminal and then filtering down the result to just the bits I need. It is one of those common tasks that could be done a thousand different ways using one of a hundred different utilities.
Let’s say for example I query the Goo.gl URL shortener API and get this response:
Pretend all I care about is the short URL from this JSON response. My go-to method for tackling this would be to use sed. It is quick and simple, plus I know the syntax by heart, so I could knock this out in seconds. It would look something like this:
echo "$RESPONSE" | grep '"id"' | sed 's/ *"id": "//' | sed 's/",//'
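To make this concrete, here is the pipeline run end to end. The sample response below is modeled on the documented JSON shape of the Goo.gl URL Shortener API (kind, id, longUrl); the URL values themselves are made up. Note that $RESPONSE is double-quoted so its newlines survive and grep can work line by line:

```shell
# Sample response modeled on the Goo.gl URL Shortener API's JSON shape;
# the URL values here are illustrative, not real
RESPONSE='{
 "kind": "urlshortener#url",
 "id": "http://goo.gl/fbsS",
 "longUrl": "http://www.google.com/"
}'

# Quoting "$RESPONSE" preserves the newlines so grep sees one field per line;
# the first sed strips the leading indent plus the key, the second the tail
echo "$RESPONSE" | grep '"id"' | sed 's/ *"id": "//' | sed 's/",//'
# prints: http://goo.gl/fbsS
```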
This works well because the short URL is returned on its own line so
grep can easily extract it. The two
sed commands swoop in and clean up what is in front of the URL and then what is behind it. If the URL were not on its own line, I would probably use grep -o with a regex pattern to pull out the id URL portion and then use the same sed method to clean up around the result.
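A sketch of that grep -o fallback might look like this, using the same made-up response minified onto a single line:

```shell
# The same illustrative response, this time with no newlines
RESPONSE='{"kind": "urlshortener#url", "id": "http://goo.gl/fbsS", "longUrl": "http://www.google.com/"}'

# -o prints only the portion of the line that matches the pattern,
# then the sed calls strip the key prefix and the trailing quote
echo "$RESPONSE" | grep -o '"id": "[^"]*"' | sed 's/"id": "//' | sed 's/"$//'
# prints: http://goo.gl/fbsS
```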
I am not saying this is the best method by any means. It works well for me, but it is actually quite rigid in implementation and will break as API responses change over time.
My question to you is — How would you tackle this simple task? What utilities, languages, commands, etc… would you use to take a
curl response and filter it down? I am truly interested to hear from you. Respond with a blog post if you have a site or just hit me up on Twitter.
On a side note, while working on this very Goo.gl link scenario, I actually used an app called CodeRunner to experiment with different methods. It is an excellent app for quickly testing ideas. Definitely check it out.
The emerging icon virtuoso company Icons & Coffee, headed up by Silvia Gatta, has released yet another great icon set — this time aimed at iOS 7 developers. Although these icons could be used for a plethora of non-iOS projects, they will definitely feel right at home on the new OS. Silvia is the girlfriend of my friend Federico Viticci, so I had the pleasure of giving feedback on these icons as well as helping test the Essence website, and I know first-hand the level of detail and effort they put into this project. I am very happy to support them by buying the icon set, and I hope you will find them useful as well.
One thing I always look for when buying an icon set is the license, so that I make sure it fits my needs. I was happy with the extremely flexible license that Icons & Coffee shipped with the Essence set.
Here is an excerpt:
* Attribution-free: You don’t have to credit the name of the author of Essence in your project. We, however, appreciate links and tweets about Essence. If you use the icons in a project, you can send us an email, and we’ll put a link on our website.
* You can use the Essence icon set for an unlimited number of products and/or clients.
The Essence icon set includes 300 handcrafted icons in both 1x and 2x flavors as well as Adobe PDF and AI files for additional uses. The set is available for $19.99 which is a great price for such a liberal license.
Here is a link to buy them: Essence by Icons & Coffee
I haven’t had much time to write on this blog, but I hope to change that. Even when I am not writing about Apple topics, I honestly live and breathe this stuff in both my personal and professional life. One question I get all the time from non-techie friends and family members is – What is iCloud and where are my photos? I know those are actually two separate questions, but whenever it comes out of someone’s mouth, it runs together like a single stream of bewilderment.
I was recently helping a family friend with photo management on her iPad, and she had asked that very question. She had taken a lot of photos during her last vacation — so many, in fact, that she had run out of space on her iPad and the device had no room left to download and install iOS 7. She had enabled “iCloud backups” and “Photo Stream,” yet she truly had no idea what either service actually did for her and her photos. Confused and unsure whether her pictures were stored locally, in the cloud, both, or neither, she decided to take the advice of her hairdresser – download Dropbox. She was told that if she downloaded the Dropbox app, it would back up the photos right from her iPad. So, this non-techie person downloaded the Dropbox app and signed up for an account. She then uploaded all her images right from the device. Afterwards, she felt comfortable safely deleting them from her iPad camera roll. She could see them on the web and on other computers, which was all she wanted. Knowing that she was not a very tech-savvy individual, I was a bit surprised that she had accomplished this entirely on her own, and I think she was equally proud of herself.
Apple does so many things amazingly well; however, photos in iCloud is obviously not one of them. I personally use Photo Stream because I have taken the time to fully understand it. Photo Stream takes pictures off of my phone and wirelessly transfers them to iPhoto on my Mac, which then gets backed up via Time Machine to my Time Capsule. This works well for me, but it’s not for everyone — especially not for those that have one foot in the Apple world and the other foot in the PC world. Whenever this is the case, Apple services seem less than intuitive, and they rarely succeed at solving their intended problems because those solutions were designed with an all-Apple ecosystem in mind. Apple could learn a lot from quality services like Dropbox. I don’t mean that Apple needs to improve their cross-platform technologies, but I think simply explaining their services in more detail, as opposed to always trying to provide invisible implementations of new services, could go a long way toward giving the average user a better experience. This is not a new revelation by any means. Just something that has been on my mind lately, and I wanted to get it out.
I half-jokingly asked David on Twitter to make an Alfred 2 (Beta) workflow that could quickly search Twitter’s new downloadable tweet archive, but knowing he is a busy guy, I didn’t really expect him to make it. However, David came through and delivered yet another excellent add-on to Alfred.
This Twitter Archive workflow is easy to configure and simple to use. Just install the workflow and run the import command with the folder path to the csv data.
The workflow will generate its own SQLite database of your tweets, making search lightning fast. Selecting a tweet from the search results will open it on Twitter.com, or you can hit CMD+C to copy the URL to your clipboard. It works flawlessly.
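I have not looked inside the workflow, so this is only a rough sketch of what a CSV-to-SQLite import like that could look like, using the sqlite3 command-line tool with made-up file and column names (the real archive’s layout differs):

```shell
# Made-up stand-in for Twitter's exported CSV — the actual archive's
# columns and file layout are different; this is purely illustrative
cat > tweets.csv <<'CSV'
12345,2013-01-15,Trying out the new Alfred v2 beta
67890,2013-01-16,Workflows make everything faster
CSV

# Load the CSV into a SQLite table
sqlite3 tweets.db <<'SQL'
CREATE TABLE IF NOT EXISTS tweets (tweet_id TEXT, created_at TEXT, text TEXT);
.mode csv
.import tweets.csv tweets
SQL

# Searching is then a simple (and fast) SQL query
sqlite3 tweets.db "SELECT text FROM tweets WHERE text LIKE '%Alfred%';"
# prints: Trying out the new Alfred v2 beta
```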
Download the Twitter Archive workflow
Also, be sure to check out his blog http://dferg.us which is updated regularly with tons of Alfred goodies and follow him on Twitter @jdfwarrior.
Alfred Mega Supporters are currently rejoicing over the first beta release of Alfred v2. One of the long-awaited features, teased by the developer, is the new workflow system, which enables users to develop comprehensive actions using many of Alfred’s powerful features wrapped up inside neat little packages. David Ferguson has already whipped up some great examples, like his Rdio and Mail search workflows. I thought I would share the first workflow I created for the Alfred v2 beta, called Gitfred. It is a workflow for interacting with your GitHub account, specifically for quickly launching repos, issues, and gists.
To use Gitfred, type one of these commands:
- repos – list available repositories
- issues – list open issues
- gists – list and view gists
*To use Gitfred, you must first enter your GitHub credentials. To do this, download and install the workflow, open the workflow folder, and edit the gitfred.py file. Change the USERNAME and PASSWORD variables, then save the file and close it.
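If you would rather not open an editor, that edit can also be scripted with sed. The two variable lines below are just a stand-in for the real gitfred.py — I am assuming simple USERNAME = "…"-style assignments, so check the actual file first:

```shell
# Stand-in for the real gitfred.py — only the two lines the
# instructions mention; the exact formatting is an assumption
cat > gitfred.py <<'PY'
USERNAME = "CHANGEME"
PASSWORD = "CHANGEME"
PY

# Rewrite both assignments; writing to a temp file and renaming keeps
# this portable between BSD and GNU sed (their -i flags differ)
sed 's/^USERNAME = .*/USERNAME = "your-github-username"/; s/^PASSWORD = .*/PASSWORD = "your-password"/' gitfred.py > gitfred.py.new
mv gitfred.py.new gitfred.py
```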
Gitfred is powered by the PyGithub project.