
Home Video Editing and Hardware

 

Getting your home videos off your camera and onto your nice new HDTV can be a challenge if your computer and its accessories are not up to the task. This article tells you what hardware you will need to get your home video off your video camera, into your computer, and onto a standard or high-definition DVD.

Getting the Video into Your Computer: The Video Capture Card

One of the first computer accessories you may need to move video from your camera to your computer is a capture card. While most newer camcorders can plug directly into your computer's USB or FireWire ports, older camcorders (especially VHS and VHS-C camcorders) cannot. If your camcorder does not have such connection ports, you will have to purchase a capture card that supports analog input. Install the capture card according to the hardware's instructions, then run a standard red/white/yellow audio/video cable from the old camcorder to the capture card. Using the software provided with the card, you can then transfer the video to your computer, edit it, add titles, and burn it to a DVD. Be aware that the term "capture card" can be misleading: many companies, such as Dazzle, make capture devices that simply plug into a standard USB port.

Storing Video on Your Computer Requires a Big Hard Drive

Another important piece of hardware for editing your home videos is your computer's hard drive. Video takes up large amounts of space, so make sure you have at least twenty gigabytes free before beginning any project. If you don't, consider upgrading or adding an additional hard drive to your system. When it comes to video editing, speed is very important, so look for hard drives with fast data-transfer ratings. To get the most out of your investment you may even need to add a SATA controller card to your computer to connect the faster Serial ATA hard drives.

Burning Your Completed Home Video to HD or Standard DVD

Aside from the video editing software, the last thing you need on your video editing system is a DVD burner. Newer combo drives, such as the LG GGW-H10N, support both HD DVD and Blu-ray high-definition discs. If you are using a high-definition camcorder and want to preserve that great resolution, consider saving up for the LG or another drive that burns the high-definition format of your choice. If high definition is not an issue for you, then you can get by with almost any standard DVD-R drive.

Getting started with transferring your home videos to DVD is a little daunting at first, especially if your hardware is not up to the challenge. Consider the tips in this article, and do your research before settling on any specific pieces of hardware.

If you want to know more about motion graphics, visual effects and editing, visit VFX Los Angeles.

How to Get a Good Google Listing

Getting a good listing on Google is pivotal to your website's success. Here are a few tips on how to move your Google listing higher up the rankings.

Getting a good Google listing for your website isn’t nearly as hard as people make it out to be. You simply contact Google’s World Domination Department, pledge your undying loyalty and support, sign over rights to whichever of your children tests as the brightest and allow certain “testing” on your vital organs and…

Okay, I’m kidding. It’s way harder than that to get a good Google listing.

Get a Good Google Listing: What Makes for a Top Ranking?

If it were just one thing that would give you a top ranking in Google, then it would be easy and everyone would do it. This would, of course, result in everyone having a high ranking, which is quite the conundrum.

To begin to understand how to approach getting a good ranking in Google, it's best to go right to the horse's mouth. At Google Webmaster Central there is an article, a couple of paragraphs long, called simply "Ranking". It gives one specific method for increasing your ranking, and since it comes directly from Google, it is advice worth heeding: "In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages."

Getting good inbound links is key. Okay, so how do you go about that? Glad you asked:

Get a Good Google Listing: Get Quality Inbound Links

The best method for getting high-quality inbound links is to have excellent content on your site. If your website is informative and offers something attractive to others, then your inbound links will naturally grow as other sites find you and link to you.

Outside of that, you can exchange links with sites in a similar niche. If your site is brand new, more established sites may not feel they will get equal value from Google by linking to your site, but if your content is good, they may trade with you anyway.

You can pay for links, but if you do, be careful that they are actually good quality. That means you should only pay for a link on a site that is in a niche similar to yours and that isn't a junk site. Don't put your link on a page that is just a series of links or otherwise doesn't offer any good, original content.

Get a Good Google Listing: Domain Name, Titles and Page Names

It is very important that your domain name reflect the words someone would use in a search engine to find your site. If you want to sell widgets and crackers and your name is Bob, then bobsplace.com would be a lousy domain name. However, widgetsandcrackersbybob.com, or better yet widgetsandcrackers.com, would be perfect. People looking for your product aren't searching for Bob; they are searching for widgets and crackers.

Make sure you title your pages in the same manner. The title should contain the keywords likely to be used to find your product or service. If you write blog posts, make sure the title of the post is incorporated into the web address, for example www.widgetsandcrackers.com/how-to-split-a-widget-or-cracker

You can actually omit the throw-away words, such as "a", "to" and "or", but make sure the pivotal keywords are in the URL.
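
To make that concrete, here is a small illustrative Ruby sketch (my own example, not from the article) that builds a URL slug from a post title while dropping a few throw-away words:

    # Hypothetical helper: turn a post title into a keyword-focused URL slug
    STOP_WORDS = %w[a an the to or and of]

    def slug_for(title)
      words = title.downcase.scan(/[a-z0-9]+/)
      words.reject { |w| STOP_WORDS.include?(w) }.join('-')
    end

    slug_for("How to Split a Widget or Cracker")  # => "how-split-widget-cracker"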

Getting a good Google listing is manageable; it just takes time, effort and study. Start with these tips and you’ll be on your way up Google’s rankings.

Conversion Optimization: Tracking Form Validation Errors with Google Analytics

After all the effort of getting your users interested in your product/service, the last thing you want is for them to exit your site because form validations were an afterthought.

Losing Conversions from Indian Street Addresses

For the last few weeks at Learnhub I've been trying to optimize the conversion rate of our school application form. This form is lengthy compared to most, and we required that potential students enter their home address.

As an experiment, we hooked up Google Analytics to track every time a validation error happened.

We were surprised to discover that 20% of users failed to enter their street address properly and that half of those users then exited the site. This was a big warning sign that our validations needed improvement.

So we began to look into why this was happening in more detail.

We realized that our Indian users were skipping the address not because they didn't want to share it, but because Indian addresses are really complicated.

In India, especially in smaller towns, street addresses are not as standardized as in other parts of the world. When users did know their address, it frequently looked like this: 83, LAXMI APPT., SEC-5, PLOT NO-27/8, ROHINI.

Asking someone to type that out is a usability nightmare.

From this data we now had a new starting point for improving conversions: making the process of entering an address easier, or making the field optional.
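
As a rough sketch of the second option, this is roughly what relaxing the requirement could look like in a Rails 2-era model; the model and attribute names here are assumptions for illustration, not Learnhub's actual code:

    # Hypothetical sketch: make the street address optional instead of required
    class SchoolApplication < ActiveRecord::Base
      validates_presence_of :street_address, :if => :address_required?

      # e.g. skip the requirement up front and collect the address later
      # via a follow-up email or a secondary form
      def address_required?
        false
      end
    end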

How Did We Track Validation Errors with Google Analytics?

Google Analytics has an awesome feature called event tracking that can easily be triggered by on-page JavaScript.

Our site was developed with Rails, so when a field fails to validate, it automatically gets wrapped in a div:

<div class="fieldWithErrors">
    <input id="question_8_street_address" name="question_8[street_address]" type="text" value="" />
</div>

So we wrote up a tiny script that:

  1. scans the page for any divs with the class fieldWithErrors
  2. grabs the ID of the form field
  3. sends an event to Google Analytics with the action "Validation Error" and the field's ID as the label

The script (using Prototype):

    // for each Rails error div, send a GA event: category 'Form', action 'Validation Error', label = failing field's ID
    $$('div.fieldWithErrors').each(function(field) {
      pageTracker._trackEvent('Form', 'Validation Error', field.down().identify());
    });

With this data you can see how many users exited the form, what country they are from, validation errors per user, and so on.

Fixing the Problem Fields

It may be beneficial to minimize the required fields to get that initial commitment.

Just like the old sales adage: if you can get the customer to say yes the first time, it will be easier to get them to say yes later on for the bigger commitment.

Making fields a requirement is always a tough balance between hurting the form's usability and getting the information you want.

If you do decide to skip the tough questions early on, a process can be set up to collect the needed information later, for example through a follow-up email or a secondary form.

Either way, it helps to have the analytics data to back up those decisions.

Optimizing Retail Websites for Maximum Search Rankings

Getting a website to rank on the first page of search listings is critical in a site’s ability to be seen. Here’s how to get maximum ranking every time.

Getting a website to rank on the first page of a search, and ideally, within the top three listings, is not just important to a retailer, it’s critical. “Very few searchers will look past the first page,” says Matt Kain, chief revenue officer at The Search Agency. “Even on the first page, the lion’s share of the clicks — something like 40 percent — goes to the No. 1 result, and the difference between positions 1 and 2 could be a significant drop-off.”

Preparing a Site for the Highest Possible Return

One expert recommends starting with a complete site and search audit. "We look at architecture, page construction, content, link popularity, and web server configuration," explains John McCarthy, director of SEO for digital agency WebMetro. They then prioritize areas for improvement by ROI.

Rob Garner, senior strategy director for digital agency iCrossing, says that the best time to optimize a website for search is when it’s being built, because “it can cost 10 times as much to re-engineer later.” He recommends staying away from rich media applications such as Flash and Ajax that make it difficult to rework tags after the fact.

The first step to gaining maximum search engine exposure is to ensure that all content is indexed. “Meta tags” are a description of the content that includes a title, author, keywords and an explanatory sentence, among other information. Very often, when a site comes up in a search result, it is the meta tag description that appears, so it is critical that meta tags be precise and informative, Garner says.

Kain adds that the most important part of the meta tags is the title, and within the title, the first few words.

Also important are "alt attribute" tags. If a person visiting a webpage has images turned off, the alt attribute allows them to see a text description of the picture (e.g., "iPhone case, black leather, flip top") rather than an empty box where a graphic should be. Alt tags, Kain says, as well as the directory file path for the image itself, should reinforce the keywords and phrases the page is relevant for.

Improving Site Rankings

Site rankings can be improved by writing detailed and descriptive content. “A site will never rank on page one for a broad category like ‘high heels’,” Garner says. A description such as ‘dark brown peek-toed pumps’ has a better chance of generating a higher ranking. Google’s Website Optimizer is helpful for testing keywords to see how they will rank.

Also, the content should look natural. Rather than repeating words, add unique content such as an editor's product review or links to sites with related information, such as health and fitness articles next to running shoes. Search engines consider all of that content in determining rankings. In addition, graphics should be labeled with the same level of detail, rather than "image1.jpg".

McCarthy recommends 'latent semantic indexing', which means using other words to help define a primary keyword phrase. For example, "If I talk about shoes, the search engines don't know what kind. If I mention stirrups and a saddle, you know I'm talking about horseshoes; Mustangs and Raybestos, you know I'm talking about brake shoes; with Nike and Puma, I'm talking about running shoes."

Finally, a page can only be optimized for so many keywords. A page is not going to rank first for both "cellphone case" and "squirt guns". Choose the one with the highest value.

Next Steps

After optimizing content, there are two additional strategies worth considering. The first is uploading content to Google Base. Google Base is a free database that helps search engines find content. It accepts both XML and Excel spreadsheets for one listing or thousands. Yahoo! offers a similar product.

Another key to elevating search rank is securing links from suppliers, partners and other sites, such as enthusiast groups or related events.

How We Improved Our Conversion Rate by 72%

One of my favourite parts of creating web apps is being able to make subtle changes to our sales pages and see the impact it has on our signups.

With CareLogger being a side project, our time available for marketing is limited. We found conversion optimization to be a good way to spend 30 minutes once a week refining our pitch to customers. It also maximizes the number of signups so we have access to more people to conduct customer development with.

After these 3 experiments, our free signup conversion rate went from 14.5% to 25%, a 72% improvement.

1) Including a pain point in our headline

When we launched the site our headline said Keeping Tabs on Your Diabetes Just Got A Lot Easier:

While this explained a benefit of our product in very general terms, it didn't hit on the real need people had when they decided to keep a logbook to record their diabetes. Making your life easier is what almost every product promises.

People weren't looking for our software because it was easier than what they were already doing. They wanted better insight into their illness so they could stay as healthy as possible.

So we changed our message to Maintain Your Optimal Health by Keeping Tabs on Your Diabetes. We also highlighted the most important benefit “Optimal Health” with green.

This change resulted in a 31% increase in conversion after 1000 trials (our metric was signing up). Not bad at all.

2) Changing our signup button from Green to Red

Earlier this week I came across an article by Performable that explained how changing their call-to-action from green to red increased conversion by 21%.

I had to try it out so that day I set up an A/B test on our homepage call-to-action.

So far we’ve had 600 participants and our conversion rate has increased by 34%.

We generally wait until 1000 trials but so far the results have been pretty significant.

We believe that it was so effective because we used green on many other parts of the landing page and the red just stands out so much more. It’s all about contrast.

3) Changing our button text from “Signup for Free” to “Get Started Now”

We also experimented with changing the button text on the call to action from "Signup for Free" to "Get Started Now". This one had a smaller effect: after 1000 trials our conversion rate improved by 7%.

The difference with this one is that "Get Started Now" sounds like an easier commitment than signing up. Signing up also has connotations of paying (our app is free).

Next up we’re going to try alternating the stock photo on the homepage from a male/female couple to a doctor and patient.

10 Inspiring SaaS Website Designs

I have been researching SaaS companies recently and wanted to share a few well-designed sites I've had bookmarked. These sites are usually the first thing their customers see, but many companies I came across overlooked the design of their sites, so here are a few that really stood out.


Site: SquareSpace

Why: It looks like the design of SquareSpace is its biggest selling point. The site has an intro video front and center with large screenshots of the software taking up most of the page. That doesn't seem to be a bad thing either. (I like dark designs for some reason.)
Best Feature: Pricing page, it’s very simple and focuses on customer objections at the bottom.


Site: Vertical Response

Why: It takes about 3 seconds to figure out what Vertical Response is. Too many SaaS sites are ambiguous when it comes to explaining what their product does. The copy on the site is interesting and gets to the point.
Best Feature: Landing page right at the beginning that converts visitors to leads.


Site: Jive Software

Why: About a month ago the home page of Jive's site mainly had a pointless 3D animation of their logo and little information. They recently updated the home page to focus back on the products, which is a nice improvement.
Best Feature: Customers page, nothing builds trust better than a list of well-known customers, and Jive has no shortage of them.


Site: Rival Map

Why: They are selling a unique type of software, and they communicate it well. There is very little clutter and they don't try to oversell.
Best Feature: Product Tour, they offer a simple walkthrough of all the features with a nice video.


Site: Campaign Monitor

Why: The colorful front page stands out and should appeal to their target market, web designers. They do a good job of communicating the simplicity of the product and pricing over the existing email marketing companies.
Best Feature: Content Focus, everything on the site is focused on its target market from its home page to the articles helping developers sell email services to clients.


Site: Helpstream

Why: Although not the most usable site ever created, the design is unique and memorable.
Best Feature: 3d-Chart, I have seen this type of chart used before, but Helpstream did it well.


Site: Taleo

Why: They sell multiple complex products to the enterprise market. Naturally, a lot of the site is content heavy but it is presented in a welcoming way. What I found interesting was how they divided content into the different areas.
Best Feature: Usage stats on the front page. With 2.5 million users it’s clearly proven there is value in the software.


Site: SuccessFactors

Why: Another site with a customer base it’s not afraid to promote. The product descriptions are also clear and well written. No endless feature lists or lengthy copy.
Best Feature: Testimonial Videos, they give their customers a face, not just a logo.


Site: Zuora

Why: They focus on the most positive aspect of the company: the founding team behind the software. This builds credibility and focuses less on the product, which is too new and unproven.
Best Feature: About Page, they clearly show the quality of the management team.


Site: Zendesk

Why: The first thing that stands out with Zendesk's site is that the pricing is very prominent on the home page. Too many business software sites hide their pricing deep in the site or require your phone number so a sales rep can call. Zendesk is not afraid to show the price, which helps customers compare it to other products and figure out whether or not it is suited for their business.
Best Feature: Customer Testimonials speak for themselves.

How to Scrape Websites in Ruby on Rails using scRUBYt!

After launching Contrastream I've always wanted to create a script that would automatically add new albums to the site as they were released (saving me a lot of time). The idea was to scrape a bunch of album titles from review sites and combine them with information from Last.fm.

The task seemed daunting at the time, so I never jumped in and tried it. After a year I finally wrote my first script and realized how wrong I was. Scraping sites with ruby is surprisingly easy.

For my first project, I decided to try the simple task of pulling product information from homedepot.com. Here’s how I did it after some trial and error. See the completed script: http://www.pastie.org/267676

The Plan

  • Go to http://homedepot.com
  • Search for “Hoover Vacuums”
  • Find the First Product in the Results
  • Click Product Details Link
  • Fetch Name + Description
  • Repeat for Each Product in the Results
  • Save Data to MySQL

What You Need

This is purely a Ruby script, but I used Rails to save the data using mass assignment + ActiveRecord (MySQL).

  1. Firefox + Firebug + XPath Checker
  2. Set up basic Rails application framework
  3. Install the scRUBYt! gem (sudo gem install scrubyt --include-dependencies)
  4. Set up the database.yml file
  5. Create homedepot.rb in [rails_root]/scripts folder

Now the fun part begins…

Fetching the site

Add this to the top of the script.

    require 'rubygems'
    require 'scrubyt'
    # load the Rails environment (the script lives in [rails_root]/scripts) so ActiveRecord is available later
    require File.dirname(__FILE__) + '/../config/environment'

    Scrubyt.logger = Scrubyt::Logger.new
    product_data = Scrubyt::Extractor.define do

      fetch 'http://www.homedepot.com/'

These are the basic scRUBYt includes plus the Rails environment, so we can use ActiveRecord. After that you need to direct the script to the site you're scraping (called fetching).

Searching the site

Now that the page is loaded up, it’s time to search the site.

      fill_textfield 'keyword', 'hoover vacuums'
      submit

I highlighted the search input box in Firefox and viewed source. I found that the input name was “keyword”. The second string is the search query “hoover vacuums”.

Find the First Product in the Results (Create a Pattern)

The search results page shows a list of Hoover products. The next step is to create a pattern that the script will follow. In this case, the pattern is: for each product in the results I want to click and see the product details.

product_row "//div[@class='product']" doend

From viewing the source I found that each product is wrapped in a div with class="product". Now, for each div with class="product", the next action will take place.

Click Product Details Link

The link to the product details page is the name of the product. For example, the link “Hoover Legacy Pet Rewind Vacuum” goes to the details page. Because the link that will be clicked is unique on each row, I need to use the xpath of the link to create a pattern. (Otherwise, I would have used product_link “View Details”.)

    product_link "/form/div[1]/p[1]/a", :generalize => false do

To find the XPath, right-click the first link in Firefox and click View XPath. This is the XPath for the link inside the div: "/form/div[1]/p[1]/a". It shows that the link is wrapped in a form tag, followed by a div, followed by a p (all within the div with class="product").

Now, scRUBYt has a pattern: for each product in the search results, click the link (located inside a form, div and p tag).

Each product in the results has the same HTML structure so the script can easily find the next product in the results.

Grab Title + Description

Now that the script is in the product details page, it’s time to start scraping some data. I wanted two things from this page, the name and description.

      product_details do
        product_record "//p[@class='product-name']" do
          name "Homelite 20 In. Homelite Corded Electric Mower"
        end
        parent "//div[@id='tab-features']" do
          description "/p[1]"
        end
      end

First, from looking at the HTML, I found the name was wrapped in the class "product-name", so now I can grab the contents. I copied the title from the first product so scRUBYt can create a pattern for the rest of the products. Now the script knows exactly what data to kick back (the text within the element with class "product-name").

The description is similar except I used the xpath “/p[1]” to create a pattern instead of copying the whole paragraph of text.

The "name" and "description" labels preceding the patterns above are attached to the data when you export it later on.

Save to MySQL

The script is pretty much done; it is now searching for hoover vacuums, going to each product page and grabbing the name/description. Now you can save the data to MySQL (or any ActiveRecord DB).

The first step is to name your database columns after the labels you gave the data above. In this case, I used "name" and "description". Mass assignment will automatically save the data to the corresponding DB columns.
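
For reference, a minimal sketch of what the matching migration and model could look like in Rails 2-era syntax; the products table and its columns are assumptions based on the labels above, not the original project's actual schema:

    # Hypothetical migration: columns named after the scRUBYt labels
    class CreateProducts < ActiveRecord::Migration
      def self.up
        create_table :products do |t|
          t.string :name
          t.text   :description
          t.timestamps
        end
      end

      def self.down
        drop_table :products
      end
    end

    # nothing special is needed in the model for mass assignment to work
    class Product < ActiveRecord::Base
    end

With the table in place, the last few lines of the script do the actual saving: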

    product_data_hash = product_data.to_hash

    product_data_hash.each do |item|
      # mass assignment maps the "name" and "description" keys to the matching columns
      @product = Product.create(item)
      @product.save
    end

Now, we save it to the DB by converting the data to a hash. Each item creates a DB entry and mass assignment automatically sorts the data into the columns.

Note: you could also save the data to XML or a basic text file.
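
For example, here is a small sketch of dumping the same data to a plain text file instead of MySQL; the file name is made up, and it simply writes each scraped record hash on its own line:

    # Hypothetical alternative: write each scraped product hash to a text file
    File.open('products.txt', 'w') do |f|
      product_data.to_hash.each do |item|
        f.puts item.inspect
      end
    end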

The Complete Script

You can view the complete script here: http://www.pastie.org/267654

To run the script, run “ruby homedepot.rb” in a command line.

7 Reasons Why My Social Music Site Never Took Off

I read posts on Hacker News about "Why my start-up failed" and found them interesting because they are raw start-up stories, not the fairy tales that magazines like to write about. So here is my shot at it.

Last summer, I started developing a social music site, Contrastream. The problem: finding good indie music was difficult, which is a big reason why the 5 big labels control everything. The solution: a Digg-inspired site where you submit albums you want to share and vote up the ones you think are worth listening to; each album gets its own page with a YouTube music video, comments, etc.

7 Reasons Why it Never Took Off

  1. Design Perfection, I've heard it's common for founders to spend too much time doing what they like or what they are good at. For me, it was spending too much time perfecting the design, not only how it looked but all aspects of the user experience. This would have been a good thing if I were a designer on a team, but there was a lot more I should have been doing: adding content, SEO, getting press, writing blog posts, and getting to know the early community, to name a few.
  2. Underestimated the "Cold Start" problem, I read this article by Bokado Social Design, which talks about a big issue you face with a social site, especially when it relies on user-generated content. The value you provide to your users centres around the content on the site, so to build a base of users you need to create a lot of content yourself so the first users can kick off the community. In my case, that meant always having interesting new indie albums on the site. But user content usually follows the 80/20/1 rule: 80% browse, 20% interact (comment), and 1% contribute (add albums). So even though I had 10,000 people visiting in the early months, I still ended up adding 75% of the content. It was very demanding and time-consuming, especially being a solo founder.
  3. Market Size vs Business Model, the market I was targeting with the site, indie fans who knew a lot about music outside of the usual review sites, was small. I had planned to monetize the site with ads, and in the process I realized that you need a LOT of people using your site every day to make money off of ads, or a certain type of user who can't tell the difference between AdSense ads and site links. The people I was targeting were also notorious for not clicking on ads (similar to the tech community).
  4. Bad launch, the launch of the site wasn't planned very well at all. I decided to use the "genius" marketing ploy of having a private beta to create scarcity. I saw a bunch of other sites doing it and figured it was a good strategy. I was wrong. Private betas are good for sites that have either complex technology or something that's hard to scale, and those are about the only times you should use one. After the site appeared to be ready to go live, I decided to email TechCrunch about it. At the time I didn't understand the importance these types of blogs put on exclusives. I figured they would take a day or two and send some questions. But about an hour or two after I emailed them, TechCrunch posted about Contrastream. Not only was I unprepared to have thousands of people come to my site right away, it turned out I didn't get the chance to fully pitch my site to TC. Michael Arrington called the number I had posted in the whois and privacy policy, which was my co-founder's cell phone. I had used his number because my phone was broken at the time. He called at about 11 pm and my friend was sleeping; he also had no idea who Michael Arrington, an internet celebrity to most tech founders, was. In his half-asleep state, he didn't know what the call was about. I didn't find out about the call until my friend woke up a couple of hours later and decided to tell me about it… so we were off to a good start.
  5. Competition, we had a lot of high-quality competitors. Our site offered information about great albums, including community voting. However, sites like Last.fm and Hype Machine were offering the actual music. That was a competitive advantage that's hard to beat, and we lacked a significant user base to convince enough people.
  6. Motivation, having to consistently find new content was probably the biggest hit to my motivation for the site. As much as I loved indie music, it was draining to constantly find new albums to post. It turned something I liked doing into a chore, mainly because at the same time I was busy marketing the site, redesigning it, and attending classes. (I was taking 18 hours of business classes a week in college.)
  7. Co-founder, I had started the site with a friend; he was smart and knew a lot about business but, in the end, could not contribute much to the site. One reason was that he is not technical, so he couldn't help much with the development of the site (and knowing who Michael Arrington was might have helped). The other reason was that he wasn't really into indie music, so providing content and reaching out to the users was a barrier.
  8. Bonus: Derivative Idea, I tried to avoid just listing the common start-up mistakes Paul Graham has written about, but this one was pretty accurate. The idea for Contrastream wasn't too innovative or original. It was inspired by Digg.com, which had applied the same model to news and articles on the web. I still believe applying this model to music is interesting and useful, but there were so many other me-too Digg sites. It was simple software to develop and it could be applied to almost anything. Plus there were open-source PHP versions of it available on the web.

4 Important Things I Learned in the Process

  1. Ruby on Rails + Mac, I had used PHP+MySQL professionally in my last year of high school but had become rusty over the years. I decided to learn Ruby on Rails to develop Contrastream and I'm completely convinced it was the right decision. Rails lets you build applications quickly and stay agile. Like most Rails developers I also bought a Mac, which is another thing I'm completely convinced about; OS X has some of the best-designed software.
  2. How to Get Press, I learned the importance of having something interesting to say, how to leverage a product launch to get press, and the other basics like press releases and messages.
  3. Practical “Getting Real”, I’ve read this book twice and had the opportunity to apply almost everything with Contrastream. Great way to learn something.
  4. Niche Social Networks are not Businesses, they are communities. They have to be built like a community. That means spending a lot of time building relationships, knowing everything about the niche, and finding users. You can monetize a niche social network but it's really important that you don't approach it like a business.

Bottom line, I learned more from starting this site than I ever would have from college business classes or reading blogs. As a music site, it hasn't exactly "failed": it still pulls in around 3,000 people a month, mainly from search engines. I now consider it a hobby site and I'm looking forward to applying what I learned with my next start-up, Integrate.