Making my Twitter feed richer with Zapier and hnrss

Photo by marek.sotak

I read Hacker News, a site for startups and technology, and occasionally post as well.  A few months back, I realized that I wanted to tweet the items I post to HN as well.  While I could have whipped something up with the HN RSS feed and the Twitter API (it would probably be easier than Twitversation), I decided to try Zapier (which I’ve loved for a while).  It was dead simple to set up a Zap reading from my HN RSS feed and posting to my Twitter feed.  It took about 10 minutes, and now I’ve doubled my posts to Twitter.
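
For the curious, here’s a rough sketch, in python, of what the DIY version might have looked like, using the feedparser and tweepy libraries (tweepy 3.x style API). The feed URL and credentials are placeholders, and a real version would persist the set of already-tweeted links between runs:

import feedparser
import tweepy

FEED_URL = "https://hnrss.org/submitted?id=yourusername"  # placeholder hnrss feed URL

# Twitter credentials from your Twitter app settings (placeholders)
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
twitter = tweepy.API(auth)

already_tweeted = set()  # a real version would persist this (file, db) between runs
for entry in feedparser.parse(FEED_URL).entries:
    if entry.link not in already_tweeted:
        # a real version would also truncate to fit the tweet length limit
        twitter.update_status(entry.title + " " + entry.link)
        already_tweeted.add(entry.link)

But ten minutes in Zapier beat even that.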

Of course, this misses out on one of the huge benefits of Twitter–the conversational nature of the app.  When my auto posts happen, I don’t have a chance to follow up, or to cc: the authors, etc.

However, the perfect is the enemy of the good, and I figured it was better to engage in Twitter haphazardly and imperfectly than not at all.


Consolidate external dependency notifications using Zapier

Photo by M1key.me

As I wrote over at the Geek Estate Blog, if you build your business on vendors, you should monitor them.  In the past, I’ve used a variety of services to monitor vendor services, from pingdom to wget/cron to nagios.  These services are great about telling you when some external service is unavailable, but are not so hot at telling you when a service is going to be down (for planned maintenance) or back up.

For that, you need to monitor, and read, vendor announcements, however the vendor has decided to provide them, whether that is a blog/RSS feed, a Twitter feed, an email newsletter, a status page or something else.

However, it can be tough to monitor and read announcements in two or more places.  Here, Zapier or a similar service can help.  Pick one place to be notified.  For me, that’s typically an email inbox, because, frankly, other data sources can be ignored (except phone texts), but I’ll always check my email.

Then, use Zapier’s zaps to transform any announcements from the other sources to emails.  For instance, there is an RSS trigger for new items in a feed and a Twitter trigger for tweets from a user.  Status pages often provide RSS feeds (Google’s does).  If the service provider doesn’t provide a structured method like an RSS feed to notify you of changes, but does provide a webpage of announcements, you could look at a service like changedetection.com and have the email sent to your inbox or parsed by Zapier and pushed to your notification location.

And for the output side, you can just use Zapier’s ‘send outbound email’ action.  If you want to have all notifications pushed to your phone, an RSS reader or a Twitter account, you can use Zapier to send texts, create RSS items or post tweets as well.
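
If you’re curious what such a Zap is doing under the hood, here’s a minimal python sketch of the RSS-to-email version, using feedparser and the standard library’s smtplib. The feed URL, addresses and SMTP host are placeholders, and a real version would track which entries it had already sent:

import smtplib
from email.mime.text import MIMEText

import feedparser

feed = feedparser.parse("https://status.example.com/feed.rss")  # placeholder status feed
for entry in feed.entries:
    msg = MIMEText(entry.summary)
    msg["Subject"] = "Vendor status: " + entry.title
    msg["From"] = "alerts@example.com"
    msg["To"] = "me@example.com"
    with smtplib.SMTP("localhost") as server:
        server.send_message(msg)

The point of Zapier is that you don’t have to write, deploy or babysit even this much code.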


Building an automated postcard mailing system with Lob and Zapier

Courtesy of smoothfluid

I was looking at automated paper mailing systems recently (and listed what I found), and was particularly impressed with Lob, especially the ease of its API.

Among other printing services, Lob will let you mail a postcard with a custom PDF on both sides, or a custom PDF on one side and a text message on the other, anywhere in the USA for $0.94.  (Sorry, I’m not sure about international postcards.)  The company for which I work sends out tens of thousands of postcards every quarter. The vendor we use charges them a similar fee (less, but in the same ballpark), but there’s a manual process to deliver the collateral and no API. So an on-demand, one-by-one postcard sending system is very interesting to me.

Note that I haven’t received the Lob postcard which I sent myself, so I can’t speak to quality. Yet.

The Lob API is a bit weird, because the request is form encoded rather than a JSON payload.  It also uses basic auth, but only the username, not the password. But the API seems to have all the pieces you’d need to generate all kinds of postcards–reminder postcards, direct mail postcards, photo postcards, etc.
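
To give a flavor of the request format, here’s a rough python sketch using the requests library. The endpoint and field names are my best recollection of the Lob docs and should be double-checked there; note the API key in the basic auth username slot with an empty password, and the form encoded body:

import requests

response = requests.post(
    "https://api.lob.com/v1/postcards",       # assumed endpoint; see the Lob docs
    auth=("test_YOUR_API_KEY", ""),           # username only, no password
    data={                                    # form encoded, not a JSON payload
        "name": "Test postcard",
        "to[name]": "Jane Doe",
        "to[address_line1]": "123 Main St",
        "to[address_city]": "Boulder",
        "to[address_state]": "CO",
        "to[address_zip]": "80301",
        "front": "https://example.com/front.pdf",  # PDF for one side
        "message": "Hello from the Lob API!",      # text message for the other
    },
)
print(response.status_code, response.json())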

After testing out the service via the web interface and cURL examples, I thought that it’d be fun to build a Zapier zap. In particular, being able to send a postcard for an entry in a Google spreadsheet seemed like a useful use case. Plus, Zapier is awesome, and I’d wanted to test out their integration environment for myself.

So, I built a Zapier integration for Lob, using the Zapier developer docs in combination with the Lob developer docs. It was actually easy. The most complicated step was translating the Zapier action data, which is a one or two dimensional array of typed data, into the Lob data format, which wanted a couple of text fields and two address arrays. Zapier has a scripting environment that let me modify data from APIs pre and post send, and even had an example about form encoded APIs. Zapier’s JavaScript scripting development environment was full featured, including syntax and error highlighting. It had no real debugging available, but I could use the venerable debug-by-log-statement method fairly easily.

Where could I take this next? Everywhere people use postcards in real life. The postcards depend on PDF files (see a sample), so if you are generating a custom postcard for each interaction, things become more complex, but there are a few APIs (based on a 30 second Google search, here and here) available for dynamic PDF generation. There are also limits on API call throughput if I stuck to the Zapier integration: I could send at most 300 postcards a day unless I managed multiple spreadsheets.

I see reminders of high value events (dentist, house maintenance, etc), contests and marketing as key opportunities for this type of service. And if I had a product where direct mail was a key component, using Lob directly would be worth serious consideration.

Regarding the Zap, I believe I cannot make this Zap available to anyone else. Since I’m not a representative of Lob, I couldn’t commit to maintaining this Zap, and Zapier doesn’t want to have any of their customers depending on an integration that could disappear or be unsupported at any time–a fair position.

If the Zapier or Lob folks want to take this integration and run with it, I’d be happy to share my code–just leave a comment. If anyone else is interested in being able to generate Lob postcards from a Google spreadsheet (or any other compatible API) via Zapier integration, let me know and I’ll see what I can do.


Zapier is awesome!

APIs are sprouting up everywhere. This is great for developers (and for end users) because it allows all kinds of automation. However, there are times when the investment of writing code to connect two APIs is too much.

Why might writing code be too much?

  • The problem is still fluid and writing code will lock in a solution.
  • There are more pressing business problems to solve.
  • There are no engineering resources available, and/or no money to hire a dev.

If any of these reasons apply to a problem you are facing, consider Zapier (or a competitor like IFTTT). These services are much like Excel macros–they require less software engineering expertise but can leverage some of the power of programming to automate away work. I’m only going to write about Zapier, since that is the solution with which I am familiar.

Zapier runs the connections between each service (called ‘Zaps’) at regular intervals. Zaps are built using a web only interface that leverages the APIs in a manner that, while not completely intuitive, is thoroughly manageable by anyone who can sum up a column in Excel.

Here’s the class of problems for which Zapier is good:

  • Connecting two services for which Zapier has connectors–this list is quite extensive.
  • Syncing needs to happen no more than every 5 minutes (15 for the free account).
  • No processing of the data during the transfer is needed (except possibly omitting some fields)–you are simply moving data from one place to another; no Yahoo! Pipes-like transformations are possible in transit.
  • One way sync is OK (though there are workarounds).
  • You don’t need to bulk load initial data via Zapier–you either can disregard initial data or load it in some other fashion.
  • You have a reasonable number of sources and sinks–each linked source and sink will take up one Zap.
  • You have to have valid accounts with each source and sink.

That’s a fair number of limitations, but even so there are a large number of common problems that are solvable by Zapier. Some examples:

  • Syncing a list of contacts from one source to another.
  • Taking a google form submission and adding a user to a mailing list.
  • Moving a row from one Google spreadsheet to another.
  • Taking an email and adding it to a database.
  • Adding a customer who you have just invoiced in QuickBooks to an email list.

These are all types of problems that can be done manually, but if frequency or scale increases, the process can run people ragged.

I want to call special attention to the email processing ability of Zapier. If you have a well-formatted email that you often receive that you want to further process, Zapier can parse it into interesting fields and send that data along to a sink like a Google Spreadsheet. Examples of well-formatted emails include order confirmations, contact us forms, and newsletter subscriptions.
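
Under the covers this is just field extraction. Here’s a toy python illustration of the idea (the email body and field layout are made up):

import re

body = """Name: Jane Doe
Email: jane@example.com
Message: Please call me about the listing on Main St."""

fields = dict(re.findall(r"^(\w+): (.*)$", body, re.MULTILINE))
print(fields["Name"], fields["Email"])  # ready to become a spreadsheet row

Zapier’s email parsing does the equivalent for you without any code.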

I have found the Zapier support folks very responsive, whether that was troubleshooting an issue with Google docs, finding out how to pronounce the company name, or explaining why having 100 Zaps reading from one Google spreadsheet was a bad idea. Having responsive support staff reassures me, because once Zapier gets embedded into business processes, ripping it out is going to be very painful and a lot of work.

You can also write your own Zaps if you have custom APIs that you’d like to integrate with. I haven’t explored this much–it does seem like a developer centric task.

Zapier is not an all-purpose tool nor a total replacement for developers, but it is definitely a great app to have in your toolbox. Take a look and see what little (or big) niggling problem it might solve: one you haven’t had time to write code for (or, if you can’t write code, one you haven’t been able to get an engineer to write code for).


Getting the good content out of a Facebook group

I am astonished at how hard it is to get information out of Facebook groups.

The startup of which I am a part has created a Facebook group for dissemination of information between commercial kitchen operators.  This was easier to get started than a forum and has the advantage of everyone being a “real person”, or at least real enough to get a Facebook account.  It also benefits from the ubiquity of Facebook–many, many folks have it on their phones and get notifications about group activity.

However, it has the detriment of being a “walled garden”, with the content of the group being unavailable for searchers on the web.  Some might argue that privacy actually is a good thing, because it will encourage folks to be more honest, but really, anything you put on Facebook can be cut and pasted and made public, so I’m not sure I buy that argument.

Regardless, I wanted to find an easy, automated way to take the Facebook group content and pipe it elsewhere, where it could be reified and curated.  A human could do that, but I’d like an automated solution. And, other than the Facebook API, I haven’t found many.  Zapier (my go-to integration choice) only recently released this as an option (in the last few months).  IFTTT doesn’t have it.  There’s no commercial solution that I could find that does this.  There are, however, some open source solutions.

The Facebook API makes it fairly easy to grab the posts of a group, and from the posts, the comments, but frankly, I really want a solution that doesn’t require coding up the JSON parsing/pagination handling/OAuth access.  I just tried the facebook-export tool and it seems to work just fine (though I ended up having to update the leveldown/levelup versions to 1.5/1.3 to get past a compile error: leveldown.target.mk:114: recipe for target 'Release/obj.target/leveldown/src/batch.o' failed). It gives you all your posts as JSON.
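
For reference, if you did want to code it up yourself against the Graph API, the pagination loop is roughly the following python sketch (using the requests library). The group id, access token and API version are placeholders, and the fields returned depend on the permissions of the token:

import requests

GROUP_ID = "1234567890"        # placeholder: your group's numeric id
ACCESS_TOKEN = "YOUR_TOKEN"    # placeholder: a token with access to the group

url = "https://graph.facebook.com/v2.5/" + GROUP_ID + "/feed"
params = {"access_token": ACCESS_TOKEN, "limit": 100}

posts = []
while url:
    page = requests.get(url, params=params).json()
    posts.extend(page.get("data", []))
    # the Graph API hands back a 'paging.next' URL until there are no more pages
    url = page.get("paging", {}).get("next")
    params = {}  # the next URL already carries the query parameters
print(len(posts), "posts fetched")

Tools like facebook-export save you from writing and maintaining exactly this kind of code.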


Lob Postcard Review

A few months ago, I wrote a Zapier app to integrate with the Lob postcard API. I actually spent the 94 cents to get a postcard delivered to me (I paid 24 cents too much, as Lob has now dropped their price). The text of the postcard doesn’t really matter, but it was for an idea I had: a SaaS that would verify someone lived where they said they lived, using postal mail. Here are the front and back of the postcard (the address is blacked out).

Lob sample postcard, address side

Lob sample postcard, front (from PDF)

Here is the PDF that Lob generated from both a PDF file I generated for the front (the QR code was created using this site) and a text message for the back.

A few observations about the postcard.

  • The card is matte and feels solid.
  • The QR code is smudged, but still works.
  • The text message on the back appears a bit closer to the edge on the actual postcard than it does on the PDF image.
  • The front of the postcard appears exactly as it was on the PDF.
  • It took about 5 business days (sorry, working from memory) for delivery.

So, if I were going to use Lob for production, I would send a few more test mailings and make sure that the smudge was a one-off and not a systemic issue. I would definitely generate PDFs for both the front and back sides–the control you have is worth the hassle. Luckily, there are many ways to generate a PDF nowadays (including, per Atwood’s Law, javascript). I also would not use it for time-sensitive notifications. To be fair, any postal mail has this limitation. For such notifications, services like Twilio or email are better fits.
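
As an example of how low the barrier is, here’s a python sketch that generates a postcard front with the reportlab library. The 6.25 by 4.25 inch page size is a guess at a US postcard with bleed; check Lob’s template specs before relying on real dimensions:

from reportlab.lib.units import inch
from reportlab.pdfgen import canvas

def make_front(filename, headline):
    # one postcard-sized page with a centered headline
    c = canvas.Canvas(filename, pagesize=(6.25 * inch, 4.25 * inch))
    c.setFont("Helvetica-Bold", 24)
    c.drawCentredString(3.125 * inch, 2.0 * inch, headline)
    c.showPage()
    c.save()

make_front("front.pdf", "Time for your six month checkup!")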

In the months since I discovered Lob, I’ve been looking for a standalone business case. However, business needs that are:

  • high value enough to spend significant per notification money and
  • slow enough to make sending mail a viable alternative to texting or emailing and
  • split apart from a larger service (like dentist appointment scheduling)

seem pretty few and far between. You can see a short discussion I kicked off on hackernews.  However, they’ve raised plenty of money, so they don’t appear to be going anywhere soon.

But the non-standalone business cases for direct postcard mail are numerous (just look in your mailbox).


I Want to Pay You Money! (Except When I Don’t)

Photo by 401(K) 2013

I saw this post from Kin Lane talking about Zapier and how one of the many advantages it has over similar services is its pricing.  I completely agree.  While I like free as much as the next person, when I’m building on something, I want to pay for it, or at least have it monetized in some fashion (Kin has a nice list of ways for API providers to monetize).  Paying for a service means:

  • the company can survive
  • great employees can be paid
  • when I complain, the company has an incentive to listen
  • the value I get from the service is above what I’m paying (aka consumer surplus), if I’m a reasonable facsimile of homo economicus.

All of these are really nice attributes of a technology I’m going to build on (not just ‘date’, as Kin says).  This is an interesting dichotomy, because the fastest path to growth is to provide a free service–then there’s no friction to signing up.

I guess the answer, at least for software products where the marginal cost is very very low, is a freemium offering, like Zapier.  Get the user in, show them how your value proposition works, and then ask them for money when they are hooked.  Just don’t make the freemium level unusable!


Building a System with IronMQ and Python

Photo by andrewrennie

One of my most recent projects was writing a system to deliver real estate listing data to a content management system. The CMS was not in my control. Since the listing data source was bursty and I wasn’t sure how the CMS would handle the load, I decided to use a message queue, where the messages would have a JSON payload. Message queues are great at decoupling components of a system.

For the queue, I used IronMQ. The company was already using it, it has a free tier (up to 24 messages a second), the service has been stable and reliable, it has great language SDKs, and setting up a durable message queue is something I’d rather outsource. (I do wish Zapier supported it.) In other situations (when posting messages from mobile apps), we ran Varnish in front of IronMQ so that it could be replaced easily. In this case, we didn’t, because there were fewer moving pieces (it was server to server communication, and it would be easier to swap out IronMQ should that be required).

I wrote the bridge code from the listing database to the message queue in python. The shop was mostly Java and some python, and python seemed a better fit for a small ‘pull from here, push to there’ application. I used pytest for unit testing, jenkins to run the unit tests in a CI environment, and autopep8 for formatting. My colleague was a more experienced python programmer, so I was able to lean on him for questions. I didn’t find python hard to pick back up (I’d scripted in python a little years ago), and it was a fun language to code in. It reminded me of perl w/r/t packages and quick developer feedback. I did miss Java’s dependency management, though (my colleague recommended virtualenv as a possible solution).
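
I can’t show the real code, but the producer side of such a bridge boils down to something like this sketch (the queue name and listing fields are placeholders; the iron_mq client reads its project id and token from an iron.json file or environment variables):

import json

from iron_mq import IronMQ

queue = IronMQ().queue("listings")  # placeholder queue name

# in the real system this dict would come from the listing database
listing = {"mls_id": "12345", "address": "123 Main St, Boulder CO", "price": 450000}
queue.post(json.dumps(listing))  # consumers pull the message, parse the JSON and push to the CMS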

The JSON payload would allow developers writing the message consumer to use almost any language they wanted–any language if they used the IronMQ REST API rather than an SDK.

I can’t share the code, but for this kind of problem, python was a great solution. And I’ll reach for IronMQ any time I need a message queue. This pair of technologies was quick to implement and easy to deploy, and high performance wasn’t really a requirement, since the frequency of the listing delivery was the real bottleneck.


Saying Goodbye to 8z

Today is my last day as an FTE. After four years as an employee, and five years as an on and off contractor, I’m moving on from 8z. Partially–it looks like I’ll be doing some contracting.  8z is a great company that I think will be fixing what is broken in real estate for the foreseeable future.

I learned a lot in my time.  Here’s a top N list:

  1. Hiring is hard–I had the luxury of hiring a few employees and ICs, and it was a grind to find the right person.  I don’t know if that was because my budget was tight (so I was really only looking at entry or junior level candidates) or the team was small (so I needed someone willing to wear at least three hats–ops, developer, QA) or my location (Boulder) or the fact that developers are in high demand, but it was a grind. However, when I did find the right person (and I think I had a pretty good batting average) it was great to see the team gel and people advance in their skills and careers.
  2. Budgets are fiction, but a necessary one.  The planning required for a budget enforces rigor, but the actual spend almost always changed. Like Eisenhower said, plans are worthless, but planning is everything.
  3. Blogging is harder to do when you’re an employee. It’s just harder to find the time.
  4. Zapier is awesome.
  5. Managing while being an individual contributor is hard.  In a small team (we were never more than four), I (and everyone else) wore many hats, including PM, tech lead, developer, QA, DBA and business analyst.  From my reading–looking at you, Rands–some of the difficulty was intrinsic to management, not just management of a small team while wearing many hats.
  6. Having monthly one on ones with your direct reports is a bare minimum.  If you aren’t having these meetings with your reports or your superior, start doing so. The open lines of communication will pay dividends long term by making sure you aren’t surprised.
  7. Using the database as a bus lets you have a number of smaller, isolated, more easily understandable software processes.
  8. Being a software developer in a small team at a real estate company made it hard to find peers around Boulder. I didn’t fit into the startup culture nor did I fit into the bigCo culture.
  9. Default to using InnoDB.
  10. Core values can actually matter, if the CEO makes a regular effort to help them live.
  11. You can ship a lot of software with a small team.  But code rot will eventually bite you–here Jenkins and automated tests (junit, dbunit, qunit, phpunit, py.test, pentaho kettle testing, etc) are your friends. Jeff, thanks for pushing hudson!
  12. Gmail is an eminently usable web client.
  13. Having two equal co-founders is a tough row to hoe.
  14. Some people can tell the difference between Budweiser and Coors with their eyes closed.
  15. Google Apps is going to severely wound MS Office, and LibreOffice might do the rest.
  16. There is no substitute (none!) for asking the person who knows.  Many times I’d be discussing requirements or a bug, and come up with a question.  After batting it around a few times, the best course was always “well, let’s just ask XXX”, where XXX was the person who knew.  The nice thing about a small company is that the right person to ask is usually obvious, and if the first person you ask isn’t the right person, they know who is. Don’t speculate–find out!
  17. MLSes … if you are in real estate software and you want listing data, and you do, be prepared to spend a significant amount of time understanding your local MLS compliance rules.
  18. Some of my biggest wins in the last four years were finding the right tech partner, not building the software from scratch.
  19. APIs make sense, even in the smallest company.  Even if you are never going to expose them to the outside world for gain (and more so if you will).  They enforce a rigor and provide interfaces for polyglot development.  And you can be surprised where they’ll be useful.
  20. Java has a tremendous ecosystem.  This includes third party jars (like this one for querying polygons), frameworks like spring, and tools like maven.  I know lots of folks aren’t impressed with maven, but I found once you get your head around it, it can make your life easier. If there’s a build feature you want, maven already has it.  Plus, dependency management kicks ass–I hope that npm learns more from maven than just avoiding XML.
  21. A small team working primarily in Java seems to be an anomaly in this day and age.
  22. Having a consistent tech stack vision is good–it leads to reuse and fewer operational issues.
  23. It is far more satisfying to ship a 90% solution than not to ship at all.
  24. Hackfests and brown bags aren’t just for software companies.  With the support of the CEO, I introduced both of these to the wider company (see my post about monthly brown bags and semi annual hackfests) and they resulted in cross pollination, public speaking skills for employees, and some great ideas, including one that became a core 8z initiative.
  25. Having software in maintenance mode is tough on a team.  One of the biggest tasks for my team for about half of my time at 8z was to maintain an old complex website (10+ years old) just enough to keep it running. It was and is crucial to the business, but the business was trying to migrate away from it for a number of reasons (gobs of technical debt, among others).  This required painful choices, because often the right technical choice (fix the root cause) was the wrong business choice, and technical band-aids were the right business choice. That wears on me.
  26. Maps in Apache Cordova are hard to get right–the single threaded javascript execution environment really stinks when it comes to complex maps. (ht @intalex)
  27. A five character domain name gets a lot of purchase offers. You can also tell, to a first approximation, how technical someone is by how they react to your short domain name.
  28. Beer Friday (aka Friday afternoon club) is a great way to get people to interact cross department, but doesn’t work for everyone–not every coworker wants to hang out and have a beer.  Having other ways to cross pollinate makes sure everyone can participate.
  29. It is addictive to hear from consumers or other users that your software made their lives better.
  30. I learned as much or more from my failures as I did from my successes.  To name some particular personal failures: I made a hire that didn’t work out, I owned a mobile app that didn’t get to production (or even to QA), and a vendor that I evaluated, recommended and implemented ended up not being a fit.  In each case it felt really crappy when it happened.  After I had some time and perspective, I was able to identify some lessons that hopefully will keep me from making the mistake again. You might hear more about these mistakes in the future.
  31. Finally, I enjoyed the opportunity to be the primary technical voice at the executive table (while I never had the title, I was the effective CTO for a time). Bringing software engineering practices into a real estate company was fun and I think it made 8z a better place to work. Of course, I could not have done any of that without the support of the owner/CEO, but it was fun to see what stuck and what didn’t.

The primary reason I joined 8z was that I felt after seven years of contracting and consulting that I wanted to be part of a team again. I also thought being an employee would let me grow in ways that you can’t as a contractor. Both of these turned out to be true.

Thanks for four great years!


How to access your QuickBooks Online data via API from the command line

As is so often the case with this blog, I’m documenting something that took me a long time to figure out.

The problem: the company I work for has data in QuickBooks Online (QBO) that we’d like to distribute elsewhere in our organization.

The answer: QBO has an API. Yahoo! They have several different SDKs (for .NET, Java and PHP) to make access even easier. (The APIs and surrounding docs are called the Intuit Partner Platform, or IPP.)

However…

The issue is that requests to the QBO API must be authenticated with OAuth (I believe it is OAuth 1.0a). And the entire Intuit Partner Platform documentation is focused on needs of developers who are building SaaS applications that consume or augment QBO data and are accessed via the QBO webapp. Which means there’s a heavy focus on web applications and OAuth flows via web applications.

But all I wanted was a command line client that could use the API’s query interface.

I was stymied by OAuth. In particular, I couldn’t find a way to get the accessToken and accessTokenSecret. I tried a number of different tacks (I beat my head against this for the better part of a day).

But I just couldn’t find a way to generate the needed tokens, with either the PHP or Java SDK clients (most of the links in this post are for the Java client, because that’s more mature–the PHP client doesn’t support JSON output or have reference documentation).

Desperate, I turned to the IPP forums, which were full of advanced questions. I did stumble on this gem: “Simple way to integrate with our Quickbooks Account”, which took me to the OAuth Playground for IPP Developers. And, voila, if you follow the steps in the playground (I used IE because Firefox failed), you will end up with a valid accessToken and accessTokenSecret.

So, with that sad story told, here’s exactly how you can access your own QBO data via the command line (I only cover a trial account here, but I believe the process is much the same with a paid account):

  1. Start your IE browser (I’m using IE 10 on windows 8)
  2. Go to https://developer.intuit.com/us
  3. Sign up (click the ‘join’ link), click ‘remember me’
  4. Go to your email and find the verify link that was sent to you. Paste it into your IE browser.
  5. Sign in again
  6. Click on ‘My Apps’
  7. Click on ‘Create New App’, then ‘QuickBooks API’
  8. Fill out the name of the app, and the other required items. You can change these all later (I think). I know you can change the URLs later.
  9. Select the level of data access you need. Since this is a test app, you can select ‘All Accounting’
  10. Click ‘Save’
  11. Open up another tab in IE and go to the QuickBooks Online site (We are just adding some dummy data here, so if you have an account, you can skip this.)
  12. Click on ‘Free Trial’
  13. Click on ‘QuickBooks Online Plus’
  14. Click on ‘Already have an Intuit user ID’
  15. Fill out the username and password you used when you signed up for your developer account.
  16. Ignore the upsell, if any
  17. Click the customers tab
  18. Click on the ‘new customer’ button
  19. Enter a first name and last name then press save
  20. Open a new tab and go to the API Console
  21. Choose the company that you want to access, and note the number next to that name. That is the company ID or the Realm ID.
  22. Open a new tab and go to the OAuth playground
  23. Go back to the developer.intuit.com tab
  24. Grab your app token (looks like b3197323bda36333b4b5fb17774440fe34d6)
  25. Go to the OAuth playground tab and put your app token in the proper field (called ‘App Token’). You’ll also want to have that later, so note it now.
  26. Click ‘Get Key and Secret using App Token’
  27. Note the consumer key and consumer secret, you’ll need them later.
  28. Click ‘Get Request Token Using Key and Secret’
  29. Click ‘Authorize Request Token’
  30. You should see a message like ‘testapp3 would like to access your Intuit company data’
  31. Click ‘Authorize’
  32. You should see a message like ‘You are securely connected to testapp3’
  33. Click ‘Return to TestApp3’
  34. Scroll down to the bottom, and you should see entries in the ‘Access Token’ and ‘Access Token Secret’ fields. Copy those, as you’ll need them later as well.
  35. Go to the SDKs page of developer.intuit.com
  36. Pick your language of choice, and follow the installation instructions.
  37. Follow the instructions in the ‘Data Service APIs’ section about setting up your environment. For Java, you’ll need to pull a few jar files into your classpath. Here’s my list of jar files in my Eclipse build path: ipp-java-qbapihelper-1.2.0-jar-with-dependencies.jar, ipp-v3-java-devkit-2.0.3-jar-with-dependencies.jar
  38. Write and run a class (like the one below) that runs a query, plugging the values you captured above into the six variables.
import static com.intuit.ipp.query.GenerateQuery.$;
import static com.intuit.ipp.query.GenerateQuery.select;

import com.intuit.ipp.core.Context;
import com.intuit.ipp.core.ServiceType;
import com.intuit.ipp.data.Customer;
import com.intuit.ipp.exception.FMSException;
import com.intuit.ipp.query.GenerateQuery;
import com.intuit.ipp.security.OAuthAuthorizer;
import com.intuit.ipp.services.DataService;
import com.intuit.ipp.services.QueryResult;

public class TestQBO {

	public static void main(String[] args) throws FMSException {
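		// Fill in the six values captured in the OAuth Playground and API Console steps above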
		String consumerKey = "...";
		String consumerSecret = "...";
		String accessToken = "...";
		String accessTokenSecret = "...";
		String appToken = "...";
		String companyId = "...";
		
		OAuthAuthorizer oauth = new OAuthAuthorizer(consumerKey, consumerSecret, accessToken, accessTokenSecret);           
		
		Context context = new Context(oauth, appToken, ServiceType.QBO, companyId);
		
		DataService service = new DataService(context);
		
		Customer customer = GenerateQuery.createQueryEntity(Customer.class);
		
		String query = select($(customer.getId()), $(customer.getGivenName())).generate();
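		// generates a query string along the lines of: SELECT Id, GivenName FROM Customer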
		
		QueryResult queryResult = service.executeQuery(query);
	
		System.out.println("from query: "+((Customer)queryResult.getEntities().get(0)).getGivenName());      
	}
}

This code gets the first name and id of the first customer in your database, and prints it to stdout. Obviously just a starting point.

I am also not sure how long the accessToken and accessTokenSecret are good for, but this will actually give you command line access to your QBO data.

(Of course, I could have just used Zapier, but that has its own limitations, limited ability to query data in an ad hoc manner being the primary one.)



© Moore Consulting, 2003-2017