How to be a Gmail Power User [Infographic]

The Internet has been trying to kill email forever and, so far, it's not doing a good job. Why else would there be so many guides and articles out there teaching you how to achieve Inbox Zero or how to be a Gmail power user?

This infographic adds to the pile with plenty of doable tips that can help you work faster (keyboard shortcuts), be more productive (canned responses, auto-archiving), and stay organized (filters, folders and labels) in Gmail.

My favorite of the lot has got to be the Undo Send feature, which has saved me more times than I'd like to admit. It's also interesting to know that you can set the cancellation period for up to 30 seconds after you hit Send. Check out the infographic for more tips and hidden Gmail secrets a power user should know.




10 Web Scraping Tools to Extract Online Data

Web scraping tools are specifically developed for extracting information from websites. Also known as web harvesting or web data extraction tools, they are useful for anyone trying to collect some form of data from the Internet. Web scraping is the new data entry technique, one that doesn't require repetitive typing or copy-pasting.

These tools look for new data manually or automatically, fetching the new or updated data and storing it for easy access. For example, one may collect info about products and their prices from Amazon using a scraping tool. In this post, we're listing the use cases of web scraping tools along with the top 10 web scraping tools to collect information with zero coding.

Use Cases of Web Scraping Tools

Web scraping tools can be used for unlimited purposes in various scenarios, but we're going to go with some common use cases that are applicable to general users.

Collect Data for Market Research

Web scraping tools can help keep you abreast of where your company or industry is heading in the next six months, serving as a powerful tool for market research. The tools can fetch data from multiple data analytics providers and market research firms and consolidate it into one spot for easy reference and analysis.

Extract Contact Info

These tools can also be used to extract data such as emails and phone numbers from various websites, making it possible to build a list of suppliers, manufacturers and other persons of interest to your business or company, alongside their respective contact addresses.

Download Solutions from StackOverflow

Using a web scraping tool, one can also download solutions for offline reading or storage by collecting data from multiple sites (including StackOverflow and other Q&A websites). This reduces dependence on active Internet connections, as the resources remain readily available regardless of Internet access.

Look for Jobs or Candidates

For personnel who are actively looking for candidates to join their team, or for jobseekers looking for a particular role or vacancy, these tools also work great, effortlessly fetching data based on different applied filters and retrieving data effectively without manual searches.

Track Prices from Multiple Markets

If you are into online shopping and love to actively track prices of the products you are looking for across multiple markets and online stores, then you definitely need a web scraping tool.

10 Best Web Scraping Tools

Let's take a look at the 10 best web scraping tools available. Some of them are free, some have trial periods and premium plans. Do look into the details before you subscribe to any of them for your needs.

Import.io

Import.io offers a builder to form your own datasets by simply importing the data from a particular web page and exporting the data to CSV. You can easily scrape thousands of web pages in minutes without writing a single line of code and build 1000+ APIs based on your requirements.

Import.io uses cutting-edge technology to fetch millions of data points every day, which businesses can avail of for small fees. Along with the web tool, it also offers free apps for Windows, Mac OS X and Linux to build data extractors and crawlers, download data and sync with the online account.

Webhose.io

Webhose.io provides direct access to real-time and structured data from crawling thousands of online sources. The web scraper supports extracting web data in more than 240 languages and saving the output in various formats including XML, JSON and RSS.

Webhose.io is a browser-based web app that uses an exclusive data crawling technology to crawl huge amounts of data from multiple channels in a single API. It offers a free plan for making 1,000 requests/month and a $50/month premium plan for 5,000 requests/month.

CloudScrape

CloudScrape supports data collection from any website and, like Webhose, requires no download. It provides a browser-based editor to set up crawlers and extract data in real-time. You can save the collected data on cloud platforms like Google Drive, or export it as CSV or JSON.

CloudScrape also supports anonymous data access by offering a set of proxy servers to hide your identity. CloudScrape stores your data on its servers for 2 weeks before archiving it. The web scraper offers 20 scraping hours for free and costs $29 per month thereafter.

Scrapinghub

Scrapinghub is a cloud-based data extraction tool that helps thousands of developers fetch valuable data. Scrapinghub uses Crawlera, a smart proxy rotator that supports bypassing bot countermeasures to crawl huge or bot-protected sites easily.

Scrapinghub converts the entire web page into organized content. Its team of experts is available to help in case its crawl builder can't meet your requirements. The basic free plan gives you access to 1 concurrent crawl, and the premium plan, at $25 per month, provides up to 4 parallel crawls.

ParseHub

ParseHub is built to crawl single and multiple websites with support for JavaScript, AJAX, sessions, cookies and redirects. The application uses machine learning technology to recognize the most complicated documents on the web and generates the output file based on the required data format.

ParseHub, apart from the web app, is also available as a free desktop application for Windows, Mac OS X and Linux with a basic free plan that covers 5 crawl projects. It also offers a premium plan, at $89 per month, with support for 20 projects and 10,000 webpages per crawl.

VisualScraper

VisualScraper is another web data extraction tool that can be used to collect information from the web. The software helps you extract data from several web pages and fetches the results in real-time. Moreover, you can export the data in various formats like CSV, XML, JSON and SQL.

You can easily collect and manage web data with its simple point-and-click interface. VisualScraper comes in free as well as premium plans starting at $49 per month with access to 100K+ pages. Its free application, similar to that of ParseHub, is available for Windows with additional C++ packages.

Spinn3r

Spinn3r allows you to fetch entire data from blogs, news and social media sites, and RSS and ATOM feeds. Spinn3r is distributed with a firehose API that manages 95% of the indexing work. It offers advanced spam protection, which removes spam and inappropriate language use, thus improving data safety.

Spinn3r indexes content similarly to Google and saves the extracted data in JSON files. The web scraper constantly scans the web and finds updates from multiple sources to get you real-time publications. Its admin console lets you control crawls, and full-text search allows making complex queries on raw data.

80legs

80legs is a powerful yet flexible web crawling tool that can be configured to your needs. It supports fetching huge amounts of data along with the option to download the extracted data instantly. The web scraper claims to crawl 600,000+ domains and is used by big players like MailChimp and PayPal.

Its 'Datafiniti' lets you search the entire data quickly. 80legs provides high-performance web crawling that works rapidly and fetches the required data in mere seconds. It offers a free plan for 10K URLs per crawl, which can be upgraded to an intro plan at $29 per month for 100K URLs per crawl.

Scraper

Scraper is a Chrome extension with limited data extraction features, but it's helpful for doing online research and exporting data to Google Spreadsheets. The tool is intended for beginners as well as experts, who can easily copy data to the clipboard or store it in spreadsheets using OAuth.

Scraper is a free tool which works right in your browser and auto-generates smaller XPaths for defining URLs to crawl. It doesn't offer the ease of automatic or bot crawling like Import.io, Webhose and others, but that's also a benefit for novices, as you don't need to tackle messy configuration.

OutWit Hub

OutWit Hub is a Firefox add-on with dozens of data extraction features to simplify your web searches. This tool can automatically browse through pages and store the extracted information in a proper format. OutWit Hub offers a single interface for scraping tiny or huge amounts of data per your needs.

OutWit Hub lets you scrape any web page from the browser itself and even create automatic agents to extract data and format it per your settings. It is one of the simplest web scraping tools, free to use, and offers the convenience of extracting web data without writing a single line of code.

Which is your favorite web scraping tool or add-on? What data do you wish to extract from the Internet? Do share your story with us using the comments section below.
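If you are curious about what these tools automate, the core extraction step can be sketched in a few lines of Python using the standard library's html.parser. This is a rough illustration only, not how any particular tool above works; the `<span class="price">` markup is an assumption invented for the example.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Minimal sketch: collect text from <span class="price"> elements."""

    def __init__(self):
        super().__init__()
        self.in_price = False   # are we inside a price span right now?
        self.prices = []        # extracted price strings

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())
            self.in_price = False

page = '<div><span class="price">$19.99</span><span class="price">$24.50</span></div>'
scraper = PriceScraper()
scraper.feed(page)
print(scraper.prices)  # ['$19.99', '$24.50']
```

Real pages are far messier than this, which is exactly why the point-and-click builders and machine-learning extractors above exist.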

How Big Data Analytics Make Cities Smarter

Editor's note: This is a post written by Andrew Deen. Andrew has been a consultant for startups in almost every industry from retail to medical devices. He implements lean methodology and is currently writing a book about scaling up businesses. Contact Andrew on Twitter.

Innovative ideas always gain traction and cannot be withheld as they expand across countries and eventually the globe. The process of moving forward exemplifies the survival of humanity, even though recent technologies have drastically changed our methods of forward momentum.

Today, this progressive revolution can be seen all around, especially in the adoption of big data within our daily lives. Our civilization can algorithmically process massive amounts of data to provide a detailed understanding of whatever we choose to examine.

As the interlacing of big data and IoT grows stronger, it will only improve our standards of living, create sustainable efficiency and cultivate smart cities in a number of significant ways.

Big Data in Smart Cities

Big data tracks transportation infrastructure needs and costs, helping cities define ways to expand their public transport options as efficiently as possible. It defines which areas of the city need to open up and how receptive people are to initiatives to raise money for such a project. Cities that use this type of big data analytics are called smart cities, and much of the world wants in on the innovations.

Some experts believe that cities could be investing nearly $400 billion a year building smart cities in the next 5 years. The government has even launched a smart city initiative to help communities tackle local challenges and improve city services.

If major cities were to invest in smart transport systems today, then by 2030 they would save around $800 billion annually. On top of that, smart transport systems also contribute in a few other ways, including:

  • Less automobile congestion and fewer accidents
  • More advancements in faster long-distance travel
  • Cleaner air from the reduction of pollution
  • An excess of new jobs from updates in transportation networks

Furthermore, upgraded transportation options appeal to established businesses looking for a new locale, just as they do to startups. Any business wants to know that its workers and clients have access to efficient modern transportation. That access lowers annual budgets for businesses in terms of what they pay in gas mileage and delivery costs.

Many major cities are starting to use INRIX, a system that analyzes data from traditional road sensor networks and mobile device data. San Francisco's Metropolitan Transportation Commission saved over $250,000 per year from the direct data collection of INRIX. The Baltimore Metropolitan Council saved $25,000 per year in fuel and labor costs because of increased efficiency.

Big Data in Law Enforcement

Contrary to popular belief, in terms of fighting crime, big data is actually allowing police and other law enforcement officers to behave less like Big Brother, not more. Data analytics lets law enforcement officers track real trouble spots and dangerous criminals.

Many local agencies are starting to use PREDPOL, or predictive policing systems, which collect three main data points from every report (type of crime, location and time of the incident) to make accurate officer deployment decisions in the future.

Since using PREDPOL, the town of Reading, Pennsylvania, has seen crime drop to the lowest rate in over 35 years, with a 19% drop in violent crimes and a staggering 44% drop in burglaries. Santa Cruz, California, saw similar results in its first year, as burglaries dropped by 11% and robberies by 27%.

Once areas of high criminal activity are identified, new education initiatives and outreach programs can be deployed in those jurisdictions.

Besides predictive policing, the FBI invested $1 billion in a next-generation identification system which uses a combination of DNA, fingerprints, 3D facial photos and voice recognition to pinpoint criminals. As smart cities start to implement these systems into their infrastructure, crimes will be mapped before they even happen, and criminals can be identified within seconds.

Big Data in Education

The introduction of big data into the education space has encouraged students of all ages to learn remotely in the comfort of their homes. Massive open online courses collect data from millions of course takers and analyze it to find trouble areas that are causing students to fail. After analyzing millions of data points, algorithms continually update each course to deliver an "adaptive learning experience" based on each individual's strengths, weaknesses and preferences.

The collection and analysis of big data helps educators understand which students need help and why, as well as identify areas in which they excel. Educators can provide relevant individual and group activities to support each student's goals and needs, and assess student progress on a consistent basis in order to challenge students and help them grow.

The analytics provide more three-dimensional insights into students' progress while giving parents a way to understand how each child learns. AltSchool is one of the first K-8 schools providing this personalized learning experience, which is only available in developing smart cities such as San Francisco and New York.

These are just two examples of the many ways smart cities are adapting schools into more personalized and remote learning platforms, which may change the learning experience forever.

Big Data in Health

The United Nations says that by 2050, 66% of the world's population will be considered urban. With populations living in such close proximity, health initiatives must be available to everyone no matter their background, race or economic status.

Big data can already predict the outbreaks of viruses and even track cases of depression. Smart cities will use millions of sensors that provide personalized medical services. Many citizens of smart cities will be able to activate their medical service via a mobile app or free-standing kiosks throughout the city. Pulsepoint Respond is a great example of a personalized app that alerts CPR-trained bystanders to sudden cardiac arrests within their immediate area.

On top of that, smart cities have already started testing systems that give elderly patients the option to remain in their homes instead of moving to a nursing care facility. These systems include a standalone tablet with Skype and wireless home sensors used for video communication between the patient and their remote caregiver.

The wireless sensors monitor the house and send alerts about safety situations such as a stove left on or doors opening in the middle of the night. After testing this system in Oslo, Norway, the study showed that it can save $85,000 per person, since they don't have to move into a nursing facility.

Big Data in Energy Usage

Over 75% of the world's energy consumption comes from cities, and 40% of municipal energy costs come solely from street lighting. Since adopting smart street lights, which automatically adjust light levels to suit the needs of citizens, Lansing, Michigan has cut its energy costs by 70%.

Experts predict that by 2020 there will be over 100 million of these smart light bulbs and lamps in use worldwide. Other cities, like Charlotte, North Carolina, have implemented smart building energy management, which cut their total energy use by 8.4% and greenhouse gas emissions by 20%. If implemented on a wide scale, smart energy efficiency systems could save the United States more than $1.2 trillion through IoT devices and big data analytics.

Moreover, the Spanish town of Santander installed 12,500 air pollution and RFID sensors around the city, which diminished energy costs by 25% and waste management costs by a further 20%. Smart cities are barely underway, yet they are already making a substantial impact on the environment and on the citizens living in them.

The 2025 Forecast

All industries can expect to see a boom as people become healthier and live longer, which will ultimately create new demands for food and housing production. Traditional methods will supply very little of that production, because big data and smart technologies have already started making monumental impacts on our way of life and the growth of civilization.
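As a toy illustration of the aggregation idea behind predictive policing systems like PREDPOL (grouping incident reports by type, location and time to rank hotspots), here is a minimal sketch. The reports and place names below are invented for the example; real systems use far richer models.

```python
from collections import Counter

# Each report carries the three data points mentioned above:
# (crime type, location, hour of incident). Invented sample data.
reports = [
    ("burglary", "Downtown", 23),
    ("burglary", "Downtown", 2),
    ("robbery", "Riverside", 22),
    ("burglary", "Riverside", 3),
    ("robbery", "Downtown", 1),
]

# Count incidents per location to surface the hottest spots.
hotspots = Counter(location for _, location, _ in reports)
print(hotspots.most_common(1))  # [('Downtown', 3)]
```

Even this crude count shows how deployment decisions fall out of the data: the location with the most reports gets the extra patrols.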

20 New Tech Words You Should Know

Editor's note: This is a contributed post by Kira Bloom. Kira was born and raised in Texas but currently lives in Tel Aviv. A chocoholic, outdoor enthusiast and tech junkie, Kira loves writing. You will often find her outdoors with a good book in hand. Follow her on Twitter, Facebook and check out her blog.

Hashtag. Podcast. Emoji. Streaming. A few years ago, these terms did not exist. In fact, they weren't even real words. But now they are a part of our daily lives and have successfully been incorporated into our everyday jargon.

I mean, even commonplace phrases like "Google it" didn't used to be a thing. Looking back, it's crazy to think that these terms weren't always a part of our daily vocabulary. So what else is to come?

Here are a few new words that we suggest you become acquainted with:

1. Digitize

(verb): The process of making something digital.

How to use it:

"Digitizing books into an eBook device was a brilliant idea."

2. Fakersation

(noun): A conversation that is faked for the sake of removing yourself from an unpleasant or awkward situation. (source)

How to use it:

"My date was terrible last night, so I used a fakersation to leave early."

3. Rankify

(verb): The process of making upsurges in life. (source)

How to use it:

"Sweet, I rankified my job and got promoted!"

4. Buffer

(noun): Temporary, short-term storage of data in a memory bank while transferring a large amount of data.

How to use it:

"I don't even remember what was on that test. I studied for it using the buffer method."

5. Moodle

(noun): An open source learning platform that is distributed freely; it primarily helps assist with e-learning. (source)

How to use it:

"I used moodle to get the info and earned an A on my exam."

6. Breadcrumbs

(noun): An option on a site that shows you where you are in relation to the site itself. A tool that makes navigation easy.

How to use it:

"I'm so lost. Can we use breadcrumbs to find our way back to where we came from?"

7. Videotize

(verb): Turning your message into a video, for convenience and ease of access. (source)

How to use it:

"Ever since Tasty videotized cooking, I've become addicted to watching people cook."

8. Happify

(verb): Using positive psychology to make yourself happier. (source)

How to use it:

"I happify by writing down three great things that happened to me at the end of every day."

9. GIFify

(verb): The process of turning an image into a GIF.

How to use it:

"That's a hilarious picture. We should GIFify it."

10. Loup

(noun): A new mode of transport that combines the accessibility of a bus route with the comfort of a private cab. (source)

How to use it:

"The next bus isn't for 20 minutes. Let's just grab a Loup."

11. Gigaflops

(noun): A unit of measurement of computer speed and performance.

How to use it:

"My gigaflops at work today were pretty weak. I need to focus better."

12. Spendlytics

(noun): An analysis of your spending habits. (source)

How to use it:

"My credit card just got rejected. I need to monitor my spendlytics."

13. Hoot

(noun, verb): A method of crowdsourcing locals to get tips on the best places to go. (source)

How to use it:

"I was hooting with a tourist from Australia last night."

"We went to the coolest bar last night that a local Londoner hooted us about."

14. Hi-res

(adjective): Refers to high quality that shows a lot of detail.

How to use it:

"His stories are so hi-res, it feels like I'm there."

15. Moo-Q

(noun): The interaction of mood and IQ. (source)

How to use it:

"My moo-q is off today – I didn't get enough sleep last night."

16. Peopleware

(noun): The role that humans play in technology, specifically in terms of computers.

How to use it:

"The technology of that app is great, but it doesn't take peopleware into account so it's less user friendly."

17. Commjacking

(verb): Hijacking the data sent over communication channels. (source)

How to use it:

"I got commjacked when I connected to public Wi-Fi and they took my credit card info."

18. Uncloud

(verb): Managing, analyzing, and organizing your cloud storage. (source)

How to use it:

"Man, my place is such a mess. I should uncloud it and see what I actually need."

19. Whitelist

(noun): A list granting access/approval to certain entities that prove to be valid.

How to use it:

"We got on the whitelist for the restaurant's reservation list, so we can brunch anytime we want there."

20. Flaming

(noun, verb): Similar to trolling, flaming is the act of posting offensive and insulting comments. This can be intentional or unintentional.

How to use it:

"All I was trying to do was answer someone's question, but they flamed me for my opinion."

"Did you see the flaming going on in the Reddit New York board? Crazy!"

URL Shortening Services, The Ultimate List

[March 9, 2016] Update: Due to the changing nature of this topic, this post has been updated with new content.

URL shortening services keep things short, sweet, and easily shareable in conversations, forum threads, social media and so on. Some services are also used to hide affiliate links or to bypass firewalls that block websites by URL.
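Under the hood, most shorteners work the same way: store the long URL in a database row, then encode that row's numeric ID in base 62 (digits plus lower- and uppercase letters) to produce the short code. Here is a minimal sketch of the encoding step, not any particular service's implementation:

```python
import string

# 62 characters: 0-9, a-z, A-Z
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def encode(n: int) -> str:
    """Encode a non-negative database row ID as a short base-62 code."""
    if n == 0:
        return ALPHABET[0]
    code = []
    while n:
        n, r = divmod(n, 62)
        code.append(ALPHABET[r])
    return "".join(reversed(code))

# A hypothetical service would store the long URL at row n
# and serve it at shrt.example/<encode(n)>.
print(encode(125))  # '21'
```

Because each extra character multiplies the namespace by 62, even a 6-character code covers about 56 billion URLs, which is why short links stay short.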

By the way, if you know of more URL shortening services, let us know.

If you're bored with the regular services, know that the Internet is home to plenty of other options, and in this post we have 40 of them, most of which you probably have not heard of. Some of them offer more than just basic URL shortening. We've briefly listed their options in a column, but if you want to know more, do check out the services and try them on for size.

40 URL Shortening Services

Table: 40 URL shortening services compared by sample shortened URL, extra options (custom alias or short URL, password protection, validity/expiry time, click and time limits, description, geotargeting, URL redirect, logged statistics, QR codes, public/private links), and bookmarklet and API availability. One memorable example: linkmoji, which shortens URLs to pure emoji, e.g. http://🌽🚀📦🍉🍎🐹🐶🍒.🍕💩.ws