Tuesday, August 25, 2009

Y! Alert: TechCrunch


The latest from TechCrunch


Twitter Wants To Track Your Clicks
Just before Twitter went down today (yup, it was down again), I noticed something strange. Whenever I clicked on a shortened link in my Twitter stream and looked at the address bar of my browser, I saw a fleeting click tracker before it redirected to the final site. It looked something like this: “http://twitter.com/link_click_count . . .” For instance, here is the full URL redirect for one link I managed to capture: http://twitter.com/link_click_count?url=http%3A%2F%2Fbit.ly%2F3omd6p&linkType=web&tweetId=3541772256&userId=12798452. Others noticed this as well. When Twitter came back up, the redirects were gone. Maybe too many people were clicking on them.

Whether this was just a test or a preview of what’s to come, it suggests that Twitter wants to track all the links people click on the site, which is something you’d think it was doing already. The way Twitter was doing the redirects was a bit clumsy: you actually ended up being redirected twice, first by the original URL shortener like bit.ly or ow.ly, and then by Twitter itself. While it only seemed to be happening on Twitter.com itself, the redirects worked for any short link, including bit.ly, TinyURL, ow.ly, and so on.

Why would Twitter be tracking links all of a sudden? It’s all about the links being passed around. Those links are a treasure trove of data. By seeing which links get shared and clicked on the most, Twitter can tell where it is sending the most traffic, who is sending the most traffic, the most popular tweets, the most influential users, and more. All of this data would come in handy for the analytics service Twitter plans to roll out to business customers.

Twitter could do a whole lot more, too. It would no longer have to rely so much on bit.ly for click data on short links, even though bit.ly remains the default URL shortener on Twitter, and thus the biggest one, for now. With its own data, Twitter can move into bit.ly’s backyard and start showing the most popular links and what is being shared right across Twitter. In addition to trending topics, it could also show trending links. (Flickr photo by Gerlos.)
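To make the captured redirect a bit easier to read, here is a minimal sketch (runnable in a modern browser console or Node) that unpacks its query string. The field names come from that single observed URL; whether Twitter keeps them in any final version is anyone’s guess.

```typescript
// A minimal sketch (not Twitter's code) of what the tracker appears to log,
// using only the fields visible in the captured redirect above.
const captured =
  "http://twitter.com/link_click_count?url=http%3A%2F%2Fbit.ly%2F3omd6p" +
  "&linkType=web&tweetId=3541772256&userId=12798452";

const params = new URL(captured).searchParams;

// searchParams decodes the percent-encoding for us.
console.log(params.get("url"));      // "http://bit.ly/3omd6p" -- still a short link, hence the second redirect
console.log(params.get("tweetId"));  // "3541772256"
console.log(params.get("userId"));   // "12798452"
console.log(params.get("linkType")); // "web"
```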
 
Marketing Decapitation In Poland: Asians Ok, Blacks Maybe Not
A reader sent in links to two otherwise identical Microsoft marketing sites. One is the standard U.S./English version, the other is in Polish. The image is the same, except that in the Polish version Microsoft has removed the head of the black man who appears in the U.S. version and photoshopped in a white guy’s head instead. The Asian man and the white woman made the cut to the Polish site unscathed. Clicking back and forth between the two pages is actually kind of creepy. The original model’s hand was left in the Polish version of the ad.

Photoshop Disasters, which also wrote about this, has some great comments: “I like how that computer monitor isn’t plugged into anything.” “Leave the white Macbook front and center in a MicroSoft Ad? You’d have thought that was the first thing they’d Photoshop out…” “It’s okay if he has a black hand.”
 
Mag.ma Unleashes Its Directory Of Hot Video Content To The Masses
Last May, we got our first taste of Mag.ma, the new video portal from Rocketboom founder Andrew Baron. We’ve been tracking the site since then, and it appeared at our Real-Time Crunchup last month, but it has remained in private beta. Today, Mag.ma finally opens up to the public.

As we’ve noted before, most people will use Mag.ma as a great place to kill time. The site presents users with around one hundred video thumbnails on a single page, with hot videos from Twitter, YouTube, Digg, a variety of other services, and the web at large. But despite the wide selection of content, the site manages not to overwhelm users. Mag.ma also puts a heavy emphasis on stat tracking (you can watch a ticker count up as more people watch a certain video), and there’s a strong discovery component that lets you see which videos other members on the site are recommending.
 
GooseGrade Uses Crowdsourcing To Edit Website Copy
GooseGrade, a startup that uses crowdsourcing to let anyone copy edit websites, has tweaked its service so that readers can now copy edit any site online. Previously, “citizen editors” could only edit sites that already had the gooseGrade plugin installed. With the launch of the new service, readers only need to install gooseGrade’s new browser bookmarklet, which can be found on gooseGrade’s website.

When readers find an error (spelling, grammar, factual, or otherwise) while browsing a site, they can click the “Copy Edit” button in their browser’s toolbar, highlight the text they are editing, and submit their edit. Citizen editors will also see any pending edits on the page submitted by other users.

To see the edits that gooseGrade users have submitted to their websites, site owners can install a free plugin for WordPress, Blogger, TypePad, or any other website or blog, which lets them accept or decline readers’ edits at their own discretion. gooseGrade shows site owners a side-by-side comparison of the original and edited text, along with an accuracy rating for each gooseGrade user based on how often their edits are accepted by other authors. For sites using gooseGrade’s WordPress plugin, accepted edits are automatically applied to the article body once the author or site admin approves them. Once an author accepts or declines an edit, it is no longer displayed to others when the bookmarklet is clicked.

This seems like a great idea, and an improvement on the site’s original model, which felt restrictive. Launched in 2008, gooseGrade is monetizing the service by offering enterprise versions that can be used in-house by large media companies like CNN or Fox News.
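For the curious, here is a rough sketch of how a “submit an edit” bookmarklet could work in principle. It is not gooseGrade’s actual code: the endpoint URL and payload fields below are invented for illustration.

```typescript
// Hypothetical bookmarklet body -- NOT gooseGrade's real implementation.
// The endpoint URL and payload fields are invented for illustration only.
(function submitEdit(): void {
  const original = window.getSelection()?.toString() ?? "";
  if (!original) {
    alert("Highlight the text you want to correct first.");
    return;
  }

  const suggestion = prompt("Your corrected version:", original);
  if (!suggestion || suggestion === original) return;

  // Send the page URL, the original text, and the proposed fix to a
  // central service where the site owner can later accept or decline it.
  void fetch("https://editing-service.example.com/edits", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ page: location.href, original, suggestion }),
  });

  alert("Thanks! Your edit has been submitted for review.");
})();
```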
 
Disqus Forks Into Two Products, Launches Revamped Real-Time Comment System
Only a month after rival comment system JS-Kit launched Echo, a real-time comment system, Disqus is striking back with its biggest upgrade since the service launched. Along with a revamped comment system, Disqus 3.0 is also what CEO Daniel Ha calls a “conceptual reconstruction” for the service: Disqus will now be split into two separate but complementary products, called Comments and Profiles, in recognition of the way two distinct sets of users have been using the commenting engine.

For those who just got worried about losing the Disqus they know and love, fear not: the service isn’t changing all that much. The Comments product has been revamped, but it’s still the commenting engine that bloggers can embed using a few lines of JavaScript (we’ve embedded the new comment system below this post if you’d like to try it out for yourself). The new product here is Profiles, which Ha says is designed to cater not to blog administrators but to empower the people who are actually out there leaving comments.

Profiles isn’t yet another social network you have to maintain. Instead, it’s a central hub for all of the comments you’ve left throughout the web using your Disqus account. From the Profiles panel, you’ll be able to quickly jump to any blog post you’ve ever left a comment on. You’ll also be given control over which comments stay attributed to you — if you find a comment you no longer wish to be associated with, you can delete it from your profile and it will be attributed to an anonymous user on the blog you left it on (I think this could lead to some problems, which I’ll get to below). There’s also an improved account connection system, which allows Disqus to identify comments you’ve left using login systems like Twitter and Facebook Connect and merge them into your profile. The service also puts a new emphasis on helping users syndicate their comments out to these social services.

In terms of actually commenting, the new system has a lot in common with Echo. When you leave a comment, it will appear in the comment stream without refreshing. You can also watch other users leave comments in real time without needing to refresh — you can optionally choose a ‘streaming’ mode, where comments appear as they are written, or a ‘queued’ mode, which is similar to Twitter Search and the Facebook News Feed in that it shows you a message like “Five new comments.” Busy sites will almost certainly want to enable the latter, as a flowing stream of comments can grow unwieldy quickly (something we saw during our Echo demo).

The service has also increased the speed with which it can suck in comments made about a post elsewhere on the web, like Twitter. Disqus has offered this feature for some time now, but Ha says the speed has been improved substantially, with around a one-minute delay to pull in comments that are left offsite. The comment system’s moderation panel has been overhauled as well to accommodate publishers of all sizes, using a Gmail-like label system, along with a number of improvements to its spam tools.

My biggest concern with Disqus 3.0 at this point involves the added control the new Profiles feature gives to users. As I mentioned before, you’ll now be able to see a list of every comment you’ve left, and if you find one you no longer wish to have associated with your name, you can simply delete it from your profile. This, in turn, will ‘anonymize’ the comment on the blog it was left on (it will just say ‘Guest’ rather than your name). Blog administrators will always be able to see the name of the person who originally left the comment and can choose to revert back to the user name, but there’s currently no way to disable the feature entirely. This may sound like a minor inconvenience, but it can be problematic for some blogs. For example, sometimes we’ll get comments in our posts that we’ll refer to later on — comments that would lose their value if they no longer reflected who wrote them. Granted, we can always force them back to show the original user name, but this is just one more thing to keep track of. All of that said, I won’t be surprised if Disqus gives blog admins the ability to turn this off in the near future.

Disqus 3.0 looks like a solid upgrade and seems to work quite well. In terms of features, Disqus has a few that Echo lacks (like the ability to pause a stream), while Echo has a few advantages (like the ability to embed video and images, and the option to syndicate to more services, including Google and Blogger). But far more important is how the system performs in a real-life scenario, under the stresses of day-to-day blog management. Which is why we’ve embedded the revamped comment system below — let’s put this through its paces, but try to keep it sane. We’ll be updating this post later in the day with a few of our observations.
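To make the ‘streaming’ versus ‘queued’ distinction above concrete, here is a small client-side sketch of the two update patterns. It is not Disqus’s actual embed code; the /comments endpoint, the Comment shape, and the element IDs are assumptions made for illustration.

```typescript
// Sketch of the two real-time update modes described above -- not Disqus code.
// The /comments endpoint, Comment shape, and element IDs are assumptions.
interface Comment { id: string; author: string; body: string; }

let lastSeenId = "";
const queued: Comment[] = [];

async function fetchNewComments(): Promise<Comment[]> {
  const res = await fetch(`/comments?since=${encodeURIComponent(lastSeenId)}`);
  return (await res.json()) as Comment[];
}

function renderComment(c: Comment): void {
  const el = document.createElement("div");
  el.textContent = `${c.author}: ${c.body}`;
  document.getElementById("thread")?.appendChild(el);
}

async function poll(mode: "streaming" | "queued"): Promise<void> {
  const fresh = await fetchNewComments();
  if (fresh.length === 0) return;
  lastSeenId = fresh[fresh.length - 1].id;

  if (mode === "streaming") {
    // Streaming: append each new comment to the thread as it arrives.
    fresh.forEach(renderComment);
  } else {
    // Queued: hold the comments and only update a counter, the way
    // Twitter Search and the Facebook News Feed do, until the reader clicks.
    queued.push(...fresh);
    const notice = document.getElementById("new-comments-notice");
    if (notice) notice.textContent = `${queued.length} new comments`;
  }
}

// Busy threads would likely prefer "queued" so the page doesn't churn.
setInterval(() => void poll("queued"), 5000);
```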
 
Apigee's Ambition Is To Be The Google Analytics For APIs
The web has come a long way from a collection of web pages to repositories of programmable data flowing freely from one site to another through APIs (Application Programming Interfaces). But it is difficult to keep track of the thousands of APIs out there. Wouldn’t it be great if there were a Google Analytics for APIs? Today, Sonoa Systems is taking the first step towards building just that with the public beta launch of Apigee. (Developers can sign up for it here.)

Sonoa built the service around its massively scalable API router. Apigee offers web developers and publishers a dashboard for managing both the APIs they provide to others and the ones they consume themselves. Easy-to-understand charts and graphs can tell a web developer which one of five APIs is messing up his mashup, or tell a bigger website when an overzealous developer is making too many calls to its API and threatening to bring the whole thing down. “The way we see APIs is like the dark matter of the Internet,” says Brian Mulloy, the general manager of Apigee. “We know they are out there, but we are not directly observing their behavior.”

For a developer whose website is a mashup of five APIs, Apigee would monitor each one for response time, error rates, and the number of requests being put through (to make sure the developer is not hitting the limit). In the illustrative screenshot below, the example is a fake app called “PhotoTwitty,” which is a mashup of APIs from Twitter, Twitter Search, Flickr, Google, and 4PayPrints. All five have to work in order for PhotoTwitty to be up and running.

Big sites with popular APIs have a different problem. They set limits so their servers don’t crash, but sometimes all it takes is one or two heavy users to cause a meltdown. Just because API providers set limits doesn’t mean they enforce them. Apigee lets sites throttle API limits with simple sliders, and once the limits are hit, no individual API user can exceed them.

The reason Apigee can do this is that it is in fact creating a proxy for each API. “We want every API going through us,” says Mulloy. It uses Sonoa’s API routers to create its proxies, and from then on all the API calls are routed through Apigee. The proxy works both ways: either the API provider or the consumer can set one up. Since you can’t drop a JavaScript beacon in an API like you can on a website, this is the only way to measure the broad usage of APIs. If enough APIs start going through Apigee’s proxies and its users allow the data to be shared anonymously in an aggregate fashion, then Apigee could eventually create an Alexa-like service as well, with public data on API usage and uptimes.

The downside to using Apigee is that it introduces latency into the whole system. Mulloy estimates that this latency is only 200 to 300 milliseconds, which is acceptable for most apps but starts to become noticeable for real-time apps like Twitter. For URL shorteners, for example, 300 milliseconds can mean the difference between an acceptable and an unacceptable lag. But that’s the price you pay for visibility into your APIs. You may also be paying Apigee, which wants to charge up to $100 a month for anyone monitoring more than 10,000 API calls an hour. Below that, the service is free. Competitors include Mashery and 3Scale.
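As a thought experiment, here is a toy version of the proxy-plus-throttle idea described above, not Apigee’s implementation: every call passes through a proxy that counts requests per API key per hour and turns callers away once they pass the limit. The upstream host, the x-api-key header, and the limit are all assumptions.

```typescript
// Toy API proxy with per-key hourly throttling -- an illustration of the
// concept only, not Apigee's implementation. UPSTREAM, the x-api-key header,
// and HOURLY_LIMIT are assumptions made for this sketch. GET-only.
import http from "node:http";

const HOURLY_LIMIT = 10_000;                 // e.g. what the throttle slider is set to
const UPSTREAM = "https://api.example.com";  // the real API being proxied
const WINDOW_MS = 60 * 60 * 1000;

const counters = new Map<string, { windowStart: number; count: number }>();

function overLimit(apiKey: string): boolean {
  const now = Date.now();
  const entry = counters.get(apiKey);
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    counters.set(apiKey, { windowStart: now, count: 1 });
    return false;
  }
  entry.count += 1;
  return entry.count > HOURLY_LIMIT;
}

http.createServer(async (req, res) => {
  const apiKey = (req.headers["x-api-key"] as string | undefined) ?? "anonymous";
  if (overLimit(apiKey)) {
    res.writeHead(429, { "content-type": "text/plain" });
    res.end("Hourly rate limit exceeded");
    return;
  }

  // Relay the call to the upstream API and pass the response back.
  // This extra hop is where the 200-300ms of added latency comes from.
  const upstream = await fetch(UPSTREAM + (req.url ?? "/"));
  res.writeHead(upstream.status, {
    "content-type": upstream.headers.get("content-type") ?? "text/plain",
  });
  res.end(await upstream.text());
}).listen(8080);
```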
 
