
Giving books as gifts

Photo by shutterhacks

A few years ago one of my contracting clients gave me a book for a Christmas gift. It was a thoughtful gesture. It mattered all the more that I received it from a client who I’d spent a lot of time working for and who I know had spent a lot of time thinking about some of the issues the book raised. I read the book, and applied its lessons to my life, both business and personal.

Since then, I’ve given books to and received books from business colleagues and clients. Books are a unique gift because they convey a message in a way that few other gifts do. A book is a more demanding gift than a Visa gift card or a piece of electronic equipment (both gifts I’ve also received and was thankful for). You are asking for the recipient’s time and focus, and making a statement that the gift is worthy of both. You’re also making the statement that the recipient will benefit from the book. It’s a gift that actually makes some demands (a bit like giving a gym membership, without as much judgment).

When you give a book, you’re giving a new idea. You are saying that you were touched or changed by this book, and you hope the recipient will connect with the same idea. You also need to understand the recipient–will they find the same benefit? Have they already tuned into the ideas, so the book will deepen their understanding? Or is this a new perspective for them?

Bonus! If you set giving worthy books as a goal, it will make you read more books. It also has the beneficial side effect of forcing you to evaluate the books you read from the perspective of ‘would I want to give this as a gift?’. Not every book has to meet that criterion, just as not every meal should be broccoli and yogurt, but it is a useful filter if you have limited reading time.

I hope I’ve convinced you to try giving books to your clients, colleagues, bosses and employees. The easiest way to start is to find a solid business book that you’ve enjoyed reading (‘Getting to Yes’ and ‘The Personal MBA’ are two candidates) and send it to someone you have worked with for a while. Then start a list and add any other books that resonate with you. When the holidays come around, you’ll have a list to choose from. Start small because of the higher bar this gift sets–you’re not just asking for a ‘Thank You’ and a place on a shelf, you’re asking for time and focus.  And those are precious.

Content generation for employee acquisition

Photo by MattHurst

I was brainstorming the other day and thought of an add-on service for recruiters. My explication of this service focuses on tech companies hiring engineers, but it could easily be modified for any organization that is trying to find high-value employees who are difficult to hire (high-performing real estate agents, salespeople, financial advisers, etc).

If your company is doing interesting things–solving tough problems or using interesting technologies–potential employees are very likely interested in your activities. How, though, will they find out about the cool problems your company is solving? Well, there’s your company website, back channel communications through professional networks, or presentations at conferences or meetups. But the traditional website serves many masters (including converting ‘customers who will pay you’), and content may be hard to generate or place. Professional networks often scale poorly, depending on where you are and what sector you are in. And presentations or meetups are rather scattershot and time consuming.

Enter the company tech blog. Write about solving interesting problems. Because the blog is tech focused, you avoid the issues with the traditional company website. (Of course, don’t write about any trade secrets or unprotected intellectual property, but in my experience a large chunk of the engineering problems solved at any company are scaffolding, not core knowledge.) It also scales, because the content is write once, recruit for years (not forever, because eventually the problems you discuss won’t be interesting). Basically, any argument in favor of content generation for customer acquisition can be applied to content generation for employee acquisition. So, set up that company blog and have your engineers start writing blog posts about interesting work they are doing. Soon, you’ll rank in Google and/or have Hacker News fame (see 42Floors).

Wait, what? You say that most engineers who are competent to write these articles have no time, no interest, no ability, or some combination of all three? I’ve seen many, many company tech blogs that start off strong and interesting and then slowly fade away. This has become more prevalent as other information dissemination platforms (your twitters, your facebooks, what have you) have proliferated, but whatever the reason, the key to a content generation strategy is to keep at it.

And that is where my idea shines. As a value-add service for a recruiter: hire reporters to interview engineers. Have the reporters write the articles, with engineer review to make sure they are correct, and put both on the byline (or use ‘as told to’). An interview about an interesting tech problem will probably take about an hour, and you can expect another hour for review, so you still have to carve out two hours from your engineer per article. If you have a team of ten engineers and half are willing to be interviewed, that is less than two hours per engineer per month to sustain a weekly blog post. Of course, you have to pay the reporter for more than two hours, but reporters are less expensive than engineers. Sure, this is an extra cost, but the article will get published. And the next one will get done. And eventually, the company will have a recruiting site working for it. Hard problems aren’t everything for engineers, but they count for a lot.

I mentioned this business plan to a friend, and his feedback was: “Seems like a great idea, but couldn’t an intern and a junior marketing person do this?” I think so, so I’d love to see more companies doing this! Hire that intern and that marketing person and start blogging about the hard problems you have faced and solved! However, maybe you outsource your recruiting efforts. If so, ask your recruiter about their content generation strategy.

If you are a recruiter, consider offering this as a value add service. (Eventually you may work yourself out of a job if your only value add is finding people, but good recruiters I’ve talked to offer more than just finding people–they screen them, make sure they are a culture fit, help the candidate through the process, and more.)

Do you know any companies or recruiters that are doing this? Do I have a fatal flaw in my idea? Let me know.

Running a brown bag lunch series in your office

Courtesy of maxually

Brown bag lunches are great opportunities for employees to share their knowledge, learn new skills, and bring a small company together.  By ‘brown bag lunch’, I mean an internal presentation lasting about an hour, made by an employee on an interesting topic of their choice.  The name comes from everyone bringing their lunch to work on that day, rather than eating out.

8z has been doing them for over two years, and here are some lessons.

  • Schedule them monthly, and one month at a time.  Don’t try to schedule out the whole year.
  • Have presenters spend as little time as possible building a PowerPoint deck.  It’s hard to get away from slides as a structural crutch, but they don’t really add a lot of value.
  • Bring in real business situations.  One of the most memorable presentations occurred when the presenters analyzed a recorded call during the presentation.
  • Have someone be the point person and recruit presenters individually.  Don’t count on volunteers, especially at the beginning.
  • It’s OK to miss a month or two if other stuff is going on.  Hello, December.
  • Record them if you can.  All you need is an iPad and a YouTube account.
  • Technical presentations (like application architecture) are appreciated by the business folks.
  • Everyone has something to say.
  • You can have people repeat every six months or so.
  • Some people won’t want to speak.
  • Presenting in pairs can work.
  • Make sure the presenter leaves plenty of time for Q&A.  8z budgets an hour for the talk and Q&A.
  • Schedule it so founders/executives attend.  This makes a powerful statement and exposes them directly to ideas from across the company.
  • Be prepared to capture changes/feedback from the presentation.
  • The departmental cross pollination is a major benefit.
  • Consider themed potlucks (Mexican, breakfast for lunch, etc) instead of brown bag lunches.

How do you spread knowledge within your small company?

Building an automated postcard mailing system with Lob and Zapier

Courtesy of smoothfluid

I was looking at automated paper mailing systems recently (and listed what I found), and was particularly impressed with Lob, especially the ease of its API.

Among other printing services, Lob will let you mail a postcard with a custom PDF on both sides, or a custom PDF on one side and a text message on the other, anywhere in the USA for $0.94.  (Sorry, I’m not sure about international postcards.)  The company I work for sends out tens of thousands of postcards every quarter.  The vendor we use charges a similar fee (less, but in the same ballpark), but there’s a manual process to deliver the collateral and no API.  So an on-demand, one-by-one postcard sending system is very interesting to me.

Note that I haven’t received the Lob postcard which I sent myself, so I can’t speak to quality. Yet.

The Lob API is a bit weird: the request is form-encoded rather than a JSON payload, and it uses basic auth, but with only the username, not the password. Still, the API seems to have all the pieces you’d need to generate all kinds of postcards–reminder postcards, direct mail postcards, photo postcards, etc.
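To make those quirks concrete, here is a minimal, hypothetical Java sketch of a postcard request. The endpoint path and field names are written from memory and should be checked against the Lob postcard documentation, and the API key is a placeholder:

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class LobPostcardSketch {

	public static void main(String[] args) throws Exception {
		String apiKey = "test_your_key_here"; // placeholder Lob test API key

		// basic auth quirk: the API key is the username and the password is blank
		String auth = Base64.getEncoder()
				.encodeToString((apiKey + ":").getBytes(StandardCharsets.UTF_8));

		// form-encoded body rather than JSON; field names are illustrative,
		// so check them against the current Lob postcard docs
		Map<String, String> fields = new LinkedHashMap<>();
		fields.put("description", "reminder postcard");
		fields.put("to[name]", "Jane Doe");
		fields.put("to[address_line1]", "123 Main St");
		fields.put("to[address_city]", "Boulder");
		fields.put("to[address_state]", "CO");
		fields.put("to[address_zip]", "80301");
		fields.put("front", "https://example.com/front.pdf"); // custom PDF for one side
		fields.put("message", "Time for your annual checkup!"); // text for the other side

		String body = fields.entrySet().stream()
				.map(e -> URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8)
						+ "=" + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
				.collect(Collectors.joining("&"));

		HttpRequest request = HttpRequest.newBuilder()
				.uri(URI.create("https://api.lob.com/v1/postcards"))
				.header("Authorization", "Basic " + auth)
				.header("Content-Type", "application/x-www-form-urlencoded")
				.POST(HttpRequest.BodyPublishers.ofString(body))
				.build();

		HttpResponse<String> response = HttpClient.newHttpClient()
				.send(request, HttpResponse.BodyHandlers.ofString());
		System.out.println(response.statusCode());
		System.out.println(response.body());
	}
}

That single form-encoded POST is essentially the whole integration, which is why it was straightforward to wire up through Zapier, as described below.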

After testing out the service via the web interface and cURL examples, I thought it’d be fun to build a Zapier zap. In particular, being able to send a postcard for each entry in a Google spreadsheet seemed genuinely useful. Plus, Zapier is awesome, and I’d been wanting to test out their integration environment for myself.

So, I built a Zapier integration for Lob, using the Zapier developer docs in combination with the Lob developer docs. It was actually easy. The most complicated step was translating the Zapier action data, which is a one- or two-dimensional array of typed data, into the Lob data format, which wanted a couple of text fields and two address arrays. Zapier has a scripting environment that let me modify data from APIs pre- and post-send, and even had an example covering form-encoded APIs. Zapier’s JavaScript scripting development environment was full-featured, including syntax and error highlighting. It had no real debugger available, but I could use the venerable debug-by-log-statement method fairly easily.

Where could I take this next? Everywhere people use postcards in real life. The postcards depend on PDF files (see a sample), so if you are generating a custom postcard for each interaction things become more complex, but there are a few APIs (based on a 30-second Google search, here and here) available for dynamic PDF generation. There are also limits on API call throughput if I stick with the Zapier integration–I could send at most 300 postcards a day, unless I managed multiple spreadsheets.

I see reminders of high value events (dentist, house maintenance, etc), contests and marketing as key opportunities for this type of service. And if I had a product where direct mail was a key component, using Lob directly would be worth serious consideration.

Regarding the Zap, I believe I can’t make it available to anyone else. Since I’m not a representative of Lob, I couldn’t commit to maintaining it, and Zapier doesn’t want any of their customers depending on an integration that could disappear or go unsupported at any time–a fair position.

If the Zapier or Lob folks want to take this integration and run with it, I’d be happy to share my code–just leave a comment. If anyone else is interested in being able to generate Lob postcards from a Google spreadsheet (or any other compatible API) via Zapier integration, let me know and I’ll see what I can do.

Google Spreadsheet Custom Functions With Spreadsheet-Based Configuration

The business for which I work, 8z Real Estate, runs on Google spreadsheets–they are everywhere, and are especially powerful when combined with Google Forms. (Except in the tech department–we use more specialized tools like wikis and bug trackers.)

Recently I was cleaning up one of these spreadsheets and making it more efficient. This spreadsheet was an ongoing list of items that should be charged to various real estate agents. There was a fairly clear set of rules (person A should be charged this for item B, but not for item C, and person D should not be charged for items E, but should for item F). The rules were also fairly constant–not a lot of change. While they were clear, there were some intricacies that had tripped up some folks. And since there was real money involved, there was often a lot of time expended to make sure the charges were correct.

For all these reasons, it made sense to automate the charge calculations. Given the data was already in a Google spreadsheet, I decided a custom function was the best way to do this. The custom function could read from a configuration tab in the same spreadsheet with a list of people and items which would represent the charge rules. In this way, if there was a new item, a new row could be added to the configuration spreadsheet without requiring any help from a developer.

I was able to write the functions fairly quickly, using QUnit for Google Apps Script. I can’t recommend QUnit highly enough–developing in Google Apps Script combines the joys of JavaScript (with its … intrinsic difficulties) and a remote execution environment that can be tough to debug. So, unit test your Apps Script code!

The initial implementation pulled data from the configuration tab with each custom function call. This naive implementation worked fine up to a couple of hundred rows. However, eventually this caused “Exceeded maximum execution time” errors. I believe this is because when the spreadsheet was calculating all the charge values, it was accessing the configuration tab’s range hundreds of times a second.

My next step was to try to cache the configuration data. I used stringified JSON stored in the cache service. Unfortunately, this caused a different issue: “Service invoked too many times in a short time: cacheService rateMax. Try Utilities.sleep(1000) between calls.”

Third time is the charm: have the function return multiple values. The configuration data is read only once, as is the list of items and names. These are iterated over, and the charges for all of them are calculated and returned as a two-dimensional array that fills entire columns. Using this technique avoided the above issues, but created another one: adding new rows with new items wouldn’t update the charges columns. Apparently there is additional, ill-documented caching of custom function values. Luckily StackOverflow had an explanation: spreadsheets “evaluate your [custom] functions only when a parameter changes.” The way I worked around this was to pass the number of non-blank rows of the spreadsheet as a parameter to the custom function: =calculateCosts(counta(A:A)). When a new row is added, the custom function is re-evaluated.

In the end I have written unit-tested code that works the way the business wants to work (in Google Spreadsheets), that runs on someone else’s infrastructure (Google’s), that can be configured by non-technical employees, and that will increase the accuracy of the charge calculations. Wahoo!

Observations on Writing a Custom Report with Java, QuickBooks, JasperReports, Google Spreadsheets and Google Drive

Image courtesy of Sean MacEntee: https://flic.kr/p/9P4CwC

A recently released project uses Java and Spring to pull data from QuickBooks, a MySQL database and Google spreadsheets, munge the data in various ways, and use JasperReports and JFreeChart to generate a good-looking report and a CSV of transactions that give our brokers weekly updates on how they are doing against their goals for the year. I then upload the files to Google Drive and send an email notifying each realtor that they have a new file.  It’s always nice to release useful software–feels like a new day dawns.

A few observations from this project:

  • The tech was interesting, but it was actually more interesting to see how the needs of tech drove the business to ‘tighten up’ their processes. Whether that was making sure there was one place to look for user type data, or defining exactly what constituted achieving a goal, or making sure that any new realtors who joined created business goals (and committed them to writing), the binary nature of software forced the business (or, more accurately, people in the business) to make decisions. Just another example of business process crystallization. This also meant deferring some software development. Where the business couldn’t answer a question definitively, rather than force them to do so, we chose to defer development.
  • I’m glad that JasperReports makes it so easy to generate PDFs–you basically create an XML template (I was unable to find a spec, but there were plentiful examples) and put in tokens for dynamic content. Then you compile the XML file, give it a map of those tokens and values (which can be text, numbers, dates or images), and export the result to PDF (a minimal sketch of this flow follows the list). Note that I was not using Jasper in the typical way–reporting over large amounts of similar data via a data connection–because I wanted a different PDF for each user. Perhaps there was a way to do this elegantly, but I was just trying to get stuff done. Creating a grid in Jasper was interesting, though.
  • JFreeChart had a very weird issue where the graph labels were bolded and italicized on staging, but not on production. Since we make every effort to keep these two environments in sync and they were running exactly the same code, this was a mystery. We finally solved it when we discovered the Java installs were different (same version, different vendors: OpenJDK vs Sun Java). It had been running this way for years. Oops.
  • Interacting with Google spreadsheets is great for the business but a pain in the butt for developers. It’s great for our business because it is extremely easy for someone who is not a programmer to create a ‘database’ that is versioned, backed up, almost always accessible and helps them ‘get stuff done’ in a measured way. It’s a pain for developers because it is a ‘database’ and not a database–no referential integrity or data typing. Also, Google provides cell-based access and row-based access, forcing you to choose. And the Java libraries are old. It beats Excel, though–at least it is accessible from a server. We ended up writing a library to wrap Google’s Java SDK to make some common operations easier.
  • Pushing to Google Drive is interesting–I alluded to this in my last post, but you have to be ready for failure. I ended up using a BlockingQueue for this–I throw files (a data structure describing the file, actually) to be uploaded onto the queue, then consumers executing in a different thread each take one off, try to upload it, and if it fails, put it back on (see the second sketch after this list). I considered using a third-party durable queue like IronMQ, but thought it was overkill.
  • Using the QuickBooks SDK, with all the accounting data exposed, lets you build some pretty powerful graphs that are useful to the business. But the docs are jumbled, with a lot of them aimed at developers who are building integrations to sell to QuickBooks users. Support is OK for standard operations, but for things like renewing your token, you have to drop down to the REST API (see my SO question). This article does a good job of outlining the various projects, but as a dev you’ll have to ignore certain sets of information–never fun when getting up to speed.
  • We do a lot of backend processing, and Spring, Maven and a custom assembly that generates a tarball on ‘mvn install’ have been great. I also finally figured out how to use Maven’s ‘release:prepare’ and ‘release:perform’ for releasing libraries, as opposed to going my own way, and that has made things much, much easier. Learn your tools, folks!
  • I’m once again astounded by the richness of the Java library ecosystem. There doesn’t seem to be much I can think of doing that doesn’t have at least one, and probably three, Java implementations.
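Two quick sketches to make the bullets above concrete. First, the JasperReports flow: compile a hand-written XML template, hand it a map of tokens, and export a per-user PDF. This is a minimal sketch, assuming a report.jrxml template; the file paths and parameter names are hypothetical:

import java.io.File;
import java.util.HashMap;
import java.util.Map;

import javax.imageio.ImageIO;

import net.sf.jasperreports.engine.JREmptyDataSource;
import net.sf.jasperreports.engine.JasperCompileManager;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.JasperReport;

public class WeeklyReportSketch {

	public static void main(String[] args) throws Exception {
		// compile the hand-written XML template
		JasperReport report = JasperCompileManager.compileReport("report.jrxml");

		// the map of tokens referenced in the template; values can be text, numbers, dates or images
		Map<String, Object> tokens = new HashMap<>();
		tokens.put("REALTOR_NAME", "Jane Doe");
		tokens.put("GOAL_PROGRESS", 0.72);
		tokens.put("GOAL_CHART", ImageIO.read(new File("goal-chart.png")));

		// no data connection: fill from the token map with an empty data source,
		// since each PDF is built per user rather than from a result set
		JasperPrint filled = JasperFillManager.fillReport(report, tokens, new JREmptyDataSource());

		JasperExportManager.exportReportToPdfFile(filled, "jane-doe-weekly.pdf");
	}
}

Second, the retry-via-BlockingQueue approach for the Google Drive uploads. UploadTask and uploadToDrive are stand-ins for the real file metadata and the Drive SDK call; the point is just the take/try/put-back loop described above:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class DriveUploadQueueSketch {

	// minimal stand-in for the data structure describing a file to upload
	record UploadTask(String localPath, String driveFolderId) { }

	private final BlockingQueue<UploadTask> queue = new LinkedBlockingQueue<>();

	// producer side: throw the task on the queue and move on
	public void enqueue(UploadTask task) {
		queue.offer(task);
	}

	// consumer side: run in its own thread; on failure, put the task back for a later retry
	public void consume() {
		while (!Thread.currentThread().isInterrupted()) {
			try {
				UploadTask task = queue.take();
				try {
					uploadToDrive(task); // hypothetical wrapper around the Google Drive API
				} catch (Exception e) {
					queue.put(task); // a real version would log and back off before retrying
				}
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt();
			}
		}
	}

	private void uploadToDrive(UploadTask task) throws Exception {
		// the real implementation calls the Google Drive API here
	}

	public static void main(String[] args) {
		DriveUploadQueueSketch uploader = new DriveUploadQueueSketch();
		new Thread(uploader::consume).start();
		uploader.enqueue(new UploadTask("/tmp/jane-doe-weekly.pdf", "drive-folder-id"));
	}
}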

How to access your QuickBooks Online data via API from the command line

As is so often the case with this blog, I’m documenting something that took me a long time to figure out.

The problem: the company I work for has data in QuickBooks Online (QBO) that we’d like to distribute elsewhere in our organization.

The answer: QBO has an API. Yahoo! They have several different SDKs (for .NET, Java and PHP) to make access even easier. (The APIs and surrounding docs are called the Intuit Partner Platform, or IPP.)

However…

The issue is that requests to the QBO API must be authenticated with OAuth (I believe it is OAuth 1.0a). And the entire Intuit Partner Platform documentation is focused on the needs of developers who are building SaaS applications that consume or augment QBO data and are accessed via the QBO webapp. Which means there’s a heavy focus on web applications and their OAuth flows.

But all I wanted was a command line client that could use the API’s query interface.

I was stymied by OAuth. In particular, I couldn’t find a way to get the accessToken and accessTokenSecret. I tried a number of different tacks (I beat my head against this for the better part of a day).

But I just couldn’t find a way to generate the needed tokens, with either the PHP or Java SDK clients (most of the links in this post are for the Java client, because that’s more mature–the PHP client doesn’t support JSON output or have reference documentation).

Desperate, I turned to the IPP forums, which were full of advanced questions. I did stumble on this gem: “Simple way to integrate with our Quickbooks Account”, which took me to the OAuth Playground for IPP Developers. And, voila, if you follow the steps in the playground (I used IE because Firefox failed), you will end up with a valid accessToken and accessTokenSecret.

So, with that sad story told, here’s exactly how you can access your own QBO data via the command line (I only cover a trial account here, but I believe the process is much the same with a paid account):

  1. Start your IE browser (I’m using IE 10 on Windows 8)
  2. Go to https://developer.intuit.com/us
  3. Sign up (click the ‘join’ link), click ‘remember me’
  4. Go to your email and find the verify link that was sent to you. Paste it into your IE browser.
  5. Sign in again
  6. Click on ‘My Apps’
  7. Click on ‘Create New App’, then ‘QuickBooks API’
  8. Fill out the name of the app, and the other required items. You can change these all later (I think). I know you can change the URLs later.
  9. Select the level of data access you need. Since this is a test app, you can select ‘All Accounting’
  10. Click ‘Save’
  11. Open up another tab in IE and go to the QuickBooks Online site (We are just adding some dummy data here, so if you have an account, you can skip this.)
  12. Click on ‘Free Trial’
  13. Click on ‘QuickBooks Online Plus’
  14. Click on ‘Already have an Intuit user ID’
  15. Fill out the username and password you used when you signed up for your developer account.
  16. Ignore the upsell, if any
  17. Click the customers tab
  18. Click on the ‘new customer’ button
  19. Enter a first name and last name then press save
  20. Open a new tab and go to the API Console
  21. Choose the company that you want to access, and note the number next to that name. That is the company ID or the Realm ID.
  22. Open a new tab and go to the OAuth playground
  23. Go back to the developer.intuit.com tab
  24. Grab your app token (looks like b3197323bda36333b4b5fb17774440fe34d6)
  25. Go to the OAuth playground tab and put your app token in the proper field (called ‘App Token’). You’ll also want to have that later, so note it now.
  26. Click ‘Get Key and Secret using App Token’
  27. Note the consumer key and consumer secret, you’ll need them later.
  28. Click ‘Get Request Token Using Key and Secret’
  29. Click ‘Authorize Request Token’
  30. You should see a message like ‘testapp3 would like to access your Intuit company data’
  31. Click ‘Authorize’
  32. You should see a message like ‘You are securely connected to testapp3’
  33. Click ‘Return to TestApp3’
  34. Scroll down to the bottom, and you should see entries in the ‘Access Token’ and ‘Access Token Secret’ fields. Copy those, as you’ll need them later as well.
  35. Go to the SDKs page of developer.intuit.com
  36. Pick your language of choice, and follow the installation instructions.
  37. Follow the instructions in the ‘Data Service APIs’ section about setting up your environment. For Java, you’ll need to pull a few jar files into your classpath. Here’s my list of jar files in my Eclipse build path: ipp-java-qbapihelper-1.2.0-jar-with-dependencies.jar, ipp-v3-java-devkit-2.0.3-jar-with-dependencies.jar
  38. Write and run a class (like the one below) that runs a query, plugging the values you captured above into the six variables.
import static com.intuit.ipp.query.GenerateQuery.$;
import static com.intuit.ipp.query.GenerateQuery.select;

import com.intuit.ipp.core.Context;
import com.intuit.ipp.core.ServiceType;
import com.intuit.ipp.data.Customer;
import com.intuit.ipp.exception.FMSException;
import com.intuit.ipp.query.GenerateQuery;
import com.intuit.ipp.security.OAuthAuthorizer;
import com.intuit.ipp.services.DataService;
import com.intuit.ipp.services.QueryResult;

public class TestQBO {

	public static void main(String[] args) throws FMSException {
		String consumerKey = "...";
		String consumerSecret = "...";
		String accessToken = "...";
		String accessTokenSecret = "...";
		String appToken = "...";
		String companyId = "...";
		
		// build the OAuth 1.0a authorizer from the consumer key/secret and access token/secret
		OAuthAuthorizer oauth = new OAuthAuthorizer(consumerKey, consumerSecret, accessToken, accessTokenSecret);

		// the context ties those credentials to your app token and company (realm) id
		Context context = new Context(oauth, appToken, ServiceType.QBO, companyId);

		DataService service = new DataService(context);

		// build a typed query against the Customer entity
		Customer customer = GenerateQuery.createQueryEntity(Customer.class);

		String query = select($(customer.getId()), $(customer.getGivenName())).generate();

		// run the query and print the first customer's given name
		QueryResult queryResult = service.executeQuery(query);

		System.out.println("from query: " + ((Customer) queryResult.getEntities().get(0)).getGivenName());
	}
}

This code gets the first name and id of the first customer in your database, and prints it to stdout. Obviously just a starting point.

I am also not sure how long the accessToken and accessTokenSecret are good for, but this will actually give you command line access to your QBO data.

(Of course, I could have just used Zapier, but that has its own limitations–limited ability to query data in an ad hoc manner being the primary one.)

Companies to come out of XOR

I read Startup Communities by Brad Feld a few months ago. I found it to be interesting even for me–someone who is only on the periphery of the VC/startup community in Boulder. I especially enjoyed his first chapter, where he examined the startup history of Boulder, from StorageTek to Celestial Seasonings.

I cut my teeth working as an employee of a startup in Boulder, XOR. We were a consulting company, and I was able to watch, fresh out of college and wet behind the ears, as we went from a small profitable company of 60 to a VC funded agglomeration of 500 employees spread across the country, and through some of the layoffs and consolidation.

I was talking with another former XOR employee (who co-founded the company I currently work for) about companies that spun out of XOR, and thought it’d be fun to collect a list.

To make this list, you have to meet the following criteria:

  • founded by someone who worked at XOR
  • had at least one employee or two founders–I started a sole proprietorship myself, but it is hard to distinguish between freelancing (which is hard, but not as hard as running a company) and a one-person company

To make the list, a company does not have to still be alive nor profitable–I’m interested in the failures as well as the successes. In addition, it doesn’t matter if the founding happened a few years or jobs after the person worked at XOR–again, I’m interested in lineage, not in direct causation.

Here are the companies I know (including XOR founders where known–there may have been other founders not listed).  In no particular order…

If you know one that is not listed, please contact me and I’ll add your suggestion.

My ODesk experience

A few months ago, a friend mentioned that he was investigating ODesk to find help for his software project.  I’d heard of ODesk before and was immediately interested.  I have a directory of Colorado farm shares which requires a lot of data entry in the early months of each year, as I get updated information from farmers.  So, I thought I’d try ODesk and see if someone could help with this task.

Because this was my first time, I was cautious.  I worked with only one contractor, and only used about 17 hours of her time.  We worked off and on for about 3 months.  She was based in the Philippines, so everything was asynchronous.  We communicated only through the ODesk interface (which is not a very good messaging system).

I chose her based on her hourly rate (pretty cheap), skill set (data entry) and reviews (very good).  I made the mistake of inviting her to apply for the job while leaving it open to other applicants, and in the space of 3 days had over 90 applicants for the position.

After I selected her and she accepted my offer, I created a data entry account and described what I wanted.  This was actually good for me, as it forced me to spell out in detail how to add, update or remove an entry, which is the start of an operations manual for my site.

After that, I managed each task I’d assigned to her through a separate email thread.  I did light review of her work after she told me she was done updating, and did go back and forth a couple of times over some of the tasks.  In general, she was very good at following instructions, and OK at taking some initiative (updating fields beyond what I specified, for example).  There were some bugs in my webapp that caused her some grief, and some updates I just did myself, as it was quicker to do them than to describe how to do them.

The variety of work you can get done via ODesk is crazy, and the overall ODesk process was very easy.  You just need to have a valid credit card.  If you are looking to start on a project right away, be aware that some lead time is required (they charge your card $10 to validate your account, and that takes some time to process).

Even though it didn’t save me a ton of time, it was a useful experiment and I’ll do it again next year.  For simple tasks that can be easily documented and outsourced, it’s a worthwhile option to explore.  Though be careful you don’t outsource too much!

RSS email campaigns

Email is one of the best ways to keep users coming back to your site.  I’d lay out the arguments, but Patrick McKenzie already has.  Go read that article if you doubt my statement.

If you generate a lot of content via your blog, then an email campaign that pulls from your RSS feeds is a great way to generate newsletter content.  If you have two sources of content–a content blog and a link blog, for example–you can have the newsletter pull from both RSS feeds and have each feed populate a different section of the newsletter.

I am a big fan of MailChimp as an email delivery service (they have a great free plan I’ve used multiple times), and was ready to roll up my sleeves and use their API to read from a set of RSS feeds and generate a newsletter from them.  But, luckily, they’ve already implemented RSS email campaigns.  If you want a monthly newsletter and are confident new articles will appear in your RSS feeds, you can schedule the newsletters out for a year and know that you’ll be sending out fresh, topical content every month.  (Note that if you don’t have new articles, the newsletter will still be sent unless you intervene, and I can’t think of much that would make an internet company look dumber than sending the same content twice in an email newsletter.)

I’m aware of solutions like paper.li that let you aggregate multiple sources of content in a pretty package, but MailChimp seems to give you more control and can be used for non-RSS email campaigns as well.

Take a look!