QuickCode
QuickCode (formerly ScraperWiki) was a web-based platform for collaboratively building programs to extract and analyze public (online) data, in a wiki-like fashion. "Scraper" refers to screen scrapers, programs that extract data from websites. "Wiki" means that any user with programming experience could create or edit such programs for extracting new data, or for analyzing existing datasets. The main use of the website was to provide a place for programmers and journalists to collaborate on analyzing public data. The service was renamed circa 2016, as "it isn't a wiki or just for scraping any more". At the same time, the eponymous parent company was renamed 'The Sensible Code Company'.


Scrapers

Scrapers were created using a browser-based IDE or by connecting via SSH to a server running Linux. They could be programmed in a variety of programming languages, including Perl, Python, Ruby, JavaScript and R.
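
For illustration only, the following minimal Python sketch shows the kind of screen scraper the platform hosted: fetch a page, pull rows out of an HTML table, and store them locally. It does not use the QuickCode/ScraperWiki API; the URL, table layout, column names and SQLite schema are hypothetical placeholders, and the third-party requests and BeautifulSoup libraries are assumed to be available.

# Minimal screen-scraper sketch (illustrative; not the platform's API).
# The URL, selectors and column names below are hypothetical.
import sqlite3

import requests
from bs4 import BeautifulSoup

URL = "https://example.org/public-data"  # hypothetical data source


def scrape(url):
    """Fetch a page and extract (name, value) pairs from its table rows."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for tr in soup.select("table tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if len(cells) >= 2:
            rows.append({"name": cells[0], "value": cells[1]})
    return rows


def save(rows, db_path="data.sqlite"):
    """Persist scraped rows to a local SQLite file."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS data (name TEXT, value TEXT)")
        conn.executemany(
            "INSERT INTO data (name, value) VALUES (:name, :value)", rows
        )


if __name__ == "__main__":
    save(scrape(URL))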


History

ScraperWiki was founded in 2009 by Julian Todd and Aidan McGuire. It was initially funded by 4iP, the venture capital arm of TV station Channel 4. It later attracted an additional £1 million round of funding from Enterprise Ventures. Aidan McGuire is the chief executive officer of The Sensible Code Company.


See also

* Data-driven journalism
* Web scraping


References


External links

* GitHub repository of custard
Categories: Collaborative projects · Wikis · Social information processing · Web analytics · Mashup (web application hybrid) · Web scraping · Software using the GNU AGPL license