Archive for the ‘Perpustakaan & Teknologi’ Category

Advan's "Magic" Pen Is Like a Personal Secretary

August 25, 2011

During meetings, the paper trail of scribbled notes is often a hassle. Notes of the discussion, charts, or diagrams have to be kept even though they risk getting lost, and transferring them into digital form on a computer takes time.

Now there is no need for that hassle, because digital pens can automatically save scribbles, drawings, or handwriting on paper as digital files. Like a personal secretary, a digital pen helps its owner document every handwritten note.

One digital pen available in Indonesia is the i-Note from Advan. What is the i-Note? It is a digital pen made by Advan that writes on paper and on the monitor screen at the same time. The product consists of a pen, fitted with a battery and a chip, and a base unit that clips onto the paper and connects to a PC or smartphone by cable.

Essentially, the product works over a Bluetooth connection. As the pen moves, the chip and the sensors in the base unit track it, so whatever is written on the paper is copied to the monitor screen, where it can be saved and converted to JPEG format.

Using the pen is fairly simple. First, the base unit needs to be charged for 3 hours to work at its best, while the pen takes two SR41 batteries, the round button cells used in wristwatches. The Note Manager software must also be installed on the PC.

Once these three preparations are done, the pen is ready to use. Open the bundled Note Manager application on the PC and clip the paper onto the base unit. Make sure the writing page is open on the monitor. You can then write directly on the paper, and the strokes are copied to the screen in real time.

A writing tip: make sure your hand does not block the base unit's sensor. When gripping the pen, keep your hand behind it and tilt the pen at 45 to 90 degrees. Also check that the pen icon has appeared on the base unit's display.

When a page is finished, press the power button on the base unit to start a new one. To erase notes that have been copied to the monitor, simply choose the "clear notes" menu. Once the notes are complete, save and convert them to JPEG, and the letter or note is ready to be sent online.

With the i-Note, notes or e-mails can be more personal because they show the user's own handwriting. The i-Note can also be used to add a signature to a letter. It certainly marks its user as tech-savvy.

Switched to "mouse mode", the i-Note can also work as a mouse: move it to point at a spot on the screen and press to select a menu. That said, mouse mode takes some getting used to.

Buying an i-Note also gets you a CD containing the Photo Sketcher software, which lets you add sketches or annotations to images stored on the PC. The pen can even be used for drawing, as it comes with a color menu.

The i-Note works with Windows and Mac PCs. It can also be used with mobile devices such as BlackBerry and Android handsets; the companion app can be downloaded free of charge from BlackBerry App World or the Android Market.

Besides documenting handwriting, the i-Note helps its user communicate more personally, for instance when sending a note by e-mail: the message can be written in one's own hand.

The i-Note is highly innovative, though it has some shortcomings. The copying from paper to monitor is sometimes slow and imperfect, so written letters can come out incomplete. And the sensor only picks up strokes at close range, not across the entire sheet. Even so, the product is worth trying. KOMPAS.com – Yunanto Wiji Utomo | Tri Wahono

http://tekno.kompas.com/read/2011/07/31/09413329/Pena.Ajaib.Advan.bak.Sekretaris.Pribadi

Mozilla to Launch a Faster, More Stable Firefox 6

August 25, 2011

Mozilla plans to release the next update to Mozilla Firefox, version 6, and says the update will ship next Tuesday.

There are several changes and improvements. The most noticeable are gains in speed and stability. Firefox 6 will also highlight the domain name in the address bar.

This sixth version of Firefox also brings HTML5 support, an enhanced add-ons manager, new Panorama features, and window permission controls.

Mozilla also announced features planned for Firefox 8, which will let users block unwanted add-ons, choosing which ones to enable and which to leave disabled.

By imam on August 14th, 2011

http://www.beritateknologi.com/minggu-depan-mozilla-akan-luncurkan-firefox-6-yang-lebih-cepat-dan-stabil/

 

Social Media Marketing Tips

August 25, 2011

Social media, or social networking sites such as Facebook, Twitter, and Koprol, have become more than just places to make friends. They can also serve as media for building awareness and for marketing products. So how can social media be put to that use?

Danny Wirianto, Chief Marketing Officer of KasKus, the largest internet forum in Indonesia, laid out several things to bear in mind when starting to market through social media. First, Danny said, "Read before write." In other words, marketing through social media must begin with research.

Research makes the marketing that follows more effective. It can cover the character of a platform's users, for example by observing how they communicate and what they communicate about.

"Get to know what they talk about," Danny said. "Once you know, make an analysis. Draw up a small strategy and put it out there. Perform an inception, exactly like in the film Inception," he explained at the SparxUp Seminar on "Understanding Social Media 2011" in Jakarta last week.

Various approaches can be applied. Danny gave one example: communicate personally, so that a new product feels like a secret. Treated that way, people feel special.

"Information treated as a secret will inevitably end up being spread around. The gain is ours: indirectly, we have succeeded in spreading the news about our product," Danny said. After that, watch what happens; if it does not work, change the strategy.

He also underlined that every social medium is unique, so marketing on each one needs its own strategy, matched to the character of its users. "Don't treat them all the same. On Facebook, for instance, don't just post a scan of a promotional pamphlet and tag it to a long list of names. Do it that way and users will certainly ignore it," Danny explained.

Danny also stressed the importance of building engagement with consumers. Engagement helps sustain the relationship with consumers and builds trust and loyalty to the product. In building engagement, giving consumers an experience can be a powerful weapon; he pointed to the games several sites have built as one example.

"Take Facebook. Users never forget it because they once played FarmVille. That is one extra point for Facebook today, and part of why it still holds on," Danny said.

On a larger scale, social media can itself be the medium for engagement, and it is a powerful way to track the issues about a product circulating among the public. "If a product is being badmouthed, the conversation will be on social media. So companies have to act like PR: they must monitor social media too, not just the big newspapers," he said.

Engagement can also mean responding quickly to problems as they arise. Prompt clarification when handling a problem strongly affects a product's image. KOMPAS.com — Tri Wahono

http://tekno.kompas.com/read/2011/01/17/11475037/Kiat.Pemasaran.di.Media.Sosial

ShortMail Helps You Write Concise Email

August 25, 2011

Write e-mail as concisely as possible, in just 500 characters, while networking on Twitter.

Effective communication is brief but covers all the information. For some people, writing e-mail eats up time and cuts into productivity, yet it still has to be done to maintain good relationships with clients.

To get around this, you can use ShortMail. With ShortMail you can write e-mail quickly, briefly, and to the point: messages are limited to 500 characters, and attachments and folders cannot be included.

To use ShortMail, log in with your Twitter account. Once logged in, write e-mail as usual, within that character limit. If your work demands writing lots of e-mail, ShortMail is a good tool, and the site lets you work while networking on Twitter. [mor] INILAH.COM, Jakarta – Nilam P.

http://id.berita.yahoo.com/shortmail-bantu-menulis-email-ringkas-073700504.html;_ylt=Aq5uUfeuHUzF9eHdPnokTgmeV8d_;_ylu=X3oDMTNoMDYyc2xlBHBrZwM2ZDU0NjJhOS04NTM2LTNlNGMtYmU5YS01Y2JhZmFjNTE2OTgEcG9zAzE2BHNlYwNsbl9CZXJpdGFUZXJraW5pX2dhbAR2ZXIDOGJiOTc5YTAtYWI5NS0xMWUwLWJlZmItNTIxMzVmNzA2NjNi;_ylv=3

 

Beef Gulai as a Side for Ketupat

August 25, 2011

Besides opor ayam, beef gulai also makes a fitting side dish for ketupat at Lebaran. The soft texture of ketupat pairs perfectly with gulai and its thick coconut-milk broth. You can substitute goat, chicken, or fish for the beef.

Beef Gulai (Gulai Daging)

Ingredients:
1 kg beef tenderloin, cut into 4 x 2 x 2 cm cubes
1,500 ml coconut milk extracted from 2 coconuts
150 g potatoes, diced
100 g carrots, diced
1 ladle of cooking oil

Spices:
3 kaffir lime leaves
4 cardamom pods
3 cm cinnamon stick
4 cloves
2 stalks lemongrass, bruised
1 tbsp tamarind water
1 tbsp fried shallots

Grind into a paste:
8 shallots
6 cloves garlic
6 red chilies
1 tsp fenugreek, toasted
1 tsp cumin, toasted
1 tsp fennel, toasted
½ tsp whole peppercorns, toasted
1 tsp coriander seeds, toasted
3 cm galangal
2 cm turmeric

Method:
1. Heat a generous amount of oil and fry the potato and carrot cubes until golden brown. Remove and set aside.

2. Heat one ladle of cooking oil. Sauté the ground spice paste until fragrant. Add the beef, lemongrass, cloves, cinnamon, cardamom, and kaffir lime leaves. Cook until the meat changes color.

3. Pour in the coconut milk and tamarind water. Cook until the meat is tender and the sauce has thickened slightly and released its oil. Just before removing from the heat, stir in the fried potatoes and carrots. Remove.

4. Pour the gulai into a serving bowl. Sprinkle with fried shallots. Serve warm as a side for ketupat.

Serves 7

Tip: Stir the gulai occasionally as it cooks so the coconut milk does not separate and curdle.

http://id.custom.yahoo.com/ramadan/kuliner-article/gulai-daging-untuk-lauk-ketupat-2-427

 

TONIDOPLUG

July 18, 2011


TonidoPlug is a tiny, low-power, low-cost home server based on a GHz ARM processor that allows you to access your applications, files, photos, music and media from anywhere via a web browser (powered by Tonido® software).

TonidoPlug comes pre-installed with powerful Tonido Applications – Torrent, Photos, Jukebox, WebsharePro, Workspace, Thots, Search, Backup and Explorer – all running on top of embedded Ubuntu Jaunty Linux OS. Additionally, TonidoPlug can be extended by installing new applications from the Tonido App store.

Applications and data are always local with TonidoPlug. Your data and applications are available even if there is no internet connection available to your TonidoPlug.

When an external USB hard drive or flash drive is connected, TonidoPlug converts it to a NAS storage accessible from Windows/Mac/Linux computers in the local network.

When you are outside your local network, you can still mount TonidoPlug folders as local drives (using WebDAV) for drag-and-drop download and upload support.

Effortlessly access and share your TonidoPlug’s files through iPhone, Android or Blackberry apps.

You can even stream media to some UPnP/DLNA compliant devices like the XBOX 360 or PlayStation 3 (Beta).

FEATURES

  • Power-efficient green computing device that typically uses 5W–13W; electricity costs about 50 cents a month to run a TonidoPlug.
  • TonidoPlug apps do not need an internet connection. After creating a Tonido profile, you never need to be online.
  • Run your own personal peer-to-peer, workspace, torrent, file and app server for $99.
  • Small form factor and low cost – suitable for homes and small businesses.
  • Access your TonidoPlug using a unique URL from anywhere via a Web browser or through native apps for iPhone, Android and BlackBerry.
  • An array of powerful and easy-to-use applications available now, with more on the way.
  • Create your own private Tonido network by running Tonido on all your machines in your intranet or with your friends and colleagues, and sync photos or workspaces among your group members.
  • Extensible development platform that allows new applications to be developed on top of Tonido.

WHAT PEOPLE ARE SAYING

“I love my (Tonido)Plug!”

“I received my plug an hour ago. I set it up in less than 5 minutes. The quick start guide is great. I’m in love with this. I will definitely tell my friends about this for Christmas presents.”

“This idea of a green server is just right. I am sending a box I was using for a similar purpose to the dumpster. I am sure I will be saving some dollars and pay for the TonidoPlug in a few months.”

“For just shy of $100, the TonidoPlug sells itself. It’s an amazing little device running awesome software which supercharges your external hard drives with a ton of networked potential.”

“I highly recommend getting a TonidoPlug. It’s incredibly fast, easy to configure and use, and will use a lot less power than my PC did staying on 24/7 (serving up shares for the family).”

“I have been running one of these for a few weeks now, definitely awesome. I have all my external drives hooked up to it and sharing on my network.”

“This has got to be one of the most impressive and versatile things I’ve laid hands on.”

SPECIFICATIONS

  • Dimensions (L*W*D): 4″* 2.5″ * 2″
  • Power Requirements: 100-240V, 50/60HZ
  • Memory: 512 MB of DDR2
  • Storage: 512 MB of Flash
  • Network: Gigabit Ethernet
  • Interface: USB 2.0
  • OS: Windows, Mac and Linux
  • Browsers: IE, Firefox and Safari

PACKAGE CONTENTS

TonidoPlug

Ethernet RJ45 Cable

AC Power Cord

Quick Start Guide

 For further information visit :  http://www.tonidoplug.com/tonido_plug.html

The Librarian Functional Position (Jabatan Fungsional Pustakawan)

July 15, 2011

The Librarian Functional Position was formally recognized with the issuance of the Decree of the State Minister for Administrative Reform (MENPAN) No. 18 of 1988 on the Librarian Functional Position and its Credit Points, later supplemented by a Joint Circular (SEB) of the Head of the National Library of Indonesia and the Head of the National Civil Service Agency, No. 53649/MPK/1998 and No. 15/SE/1998. The functional position was created so that librarians could advance their careers in line with their achievements and potential.

To broaden the range of activities for which credit points can be earned, and in anticipation of the Presidential Decree on Civil Servant Job Families, MENPAN Decree No. 33 of 1998 on the Librarian Functional Position and its Credit Points was issued on 24 February 1998 as a revision of MENPAN Decree No. 18 of 1988. It was followed by a Joint Decree of the Head of the National Library of Indonesia and the Head of the National Civil Service Agency, No. 07 of 1998 and No. 59 of 1998, which served as its implementing guidelines.

With the issuance of Law No. 22 of 1999 on Regional Government and Presidential Decree No. 87 of 1999 on Job Families, MENPAN Decree No. 33 of 1998 logically needed revision, and on 3 December 2002 MENPAN Decree No. 132/KEP/M.PAN/12/2002 on the Librarian Functional Position and its Credit Points was issued. It was in turn followed by a Joint Decree of the Head of the National Library of Indonesia and the Head of the National Civil Service Agency, No. 23 of 2003 and No. 21 of 2003.

At present the database of the Center for Librarian Development records 2,240 functional librarians spread across libraries throughout Indonesia. The data show that functional librarians remain concentrated in university and special libraries; only 36 (1.6%) serve in public libraries and 201 (8.9%) in school libraries. The state of librarianship in those two types of library also reflects how far they lag behind other kinds of library. For that reason, from the 2003 budget year onward, librarian development will focus on librarians in public and school libraries. That development must, of course, be coordinated with the relevant agencies and work units.

Position Levels

Under MENPAN Decree No. 132/KEP/M.PAN/12/2002, the librarian functional position has two tracks, skilled and expert, distinguished by the librarian's educational background. The skilled track is for functional librarians holding a D2/D3 in library, documentation and information science (Pusdokinfo), or a D2/D3 in another field plus an equivalent training course. The expert track is for librarians holding at least a bachelor's degree (S1) in Pusdokinfo, or an S1 in another field plus the training course for expert librarians.

The skilled track comprises:

·  Pustakawan Pelaksana: grades II/b, II/c and II/d

·  Pustakawan Pelaksana Lanjutan: grades III/a and III/b

·  Pustakawan Penyelia: grades III/c and III/d

The expert track comprises:

·  Pustakawan Pertama: grades III/a and III/b

·  Pustakawan Muda: grades III/c and III/d

·  Pustakawan Madya: grades IV/a, IV/b and IV/c

·  Pustakawan Utama: grades IV/d and IV/e

How to Apply for Appointment as a Functional Librarian

Submit a written request for appointment to the Librarian Functional Position, addressed to the official authorized to determine credit points, enclosing:

·  A copy of the most recent appointment decree (SK)

·  A copy of the most recent DP-3 performance appraisal

·  A copy of the most recent diploma

·  A copy of the Diklat Kepustakawanan (library training) certificate, for applicants without a Pusdokinfo background

·  The DUPAK (credit point assessment proposal) with supporting physical evidence

CLOUD COMPUTING

July 4, 2011

Cloud computing

From Wikipedia, the free encyclopedia

Cloud computing conceptual diagram

Cloud computing refers to the provision of computational resources on demand via a computer network. In the traditional model of computing, both data and software are fully contained on the user’s computer; in cloud computing, the user’s computer may contain almost no software or data (perhaps a minimal operating system and web browser only), serving as little more than a display terminal for processes occurring on a network of computers far away. A common shorthand for a provider’s cloud computing service (or even an aggregation of all existing cloud services) is “The Cloud”.

The most common analogy to explain cloud computing is that of public utilities such as electricity, gas, and water. Just as centralized and standardized utilities free individuals from the vagaries of generating their own electricity or pumping their own water, cloud computing frees the user from having to deal with the physical, hardware aspects of a computer or the more mundane software maintenance tasks of possessing a physical computer in their home or office. Instead they use a share of a vast network of computers, reaping economies of scale.

The term “cloud computing” originated from the cloud symbol conventionally used in flow charts and diagrams to represent the internet. The principle behind the cloud is that any computer connected to the internet is connected to the same pool of computing power, applications, and files. Users can store and access their own personal files such as music, pictures, videos, and bookmarks, or play games or use productivity applications on a remote server, rather than physically carrying around a storage medium such as a DVD or thumb drive. Almost all users of the internet may be using a form of cloud computing, though few realize it. Those who use web-based email such as Gmail or Hotmail instead of receiving mail on their computer with Outlook or Entourage are the most common examples of such users.

How it works

To understand the concept of cloud computing, in other words computing as a utility rather than a product, one must compare it to other utilities. Without public utilities such as electricity, water, and sewers, a homeowner would be responsible for all aspects of these concerns. The homeowner would have to own and maintain a generator, keeping it fueled and operational and its failure would mean a power outage. They would have to pump water from a well, purify it, and store it on their property. And they would have to collect their sewage in a tank and personally transport it to a place where it could be disposed of, or they would have to run their own personal sewage treatment plant.

Since the above scenario not only represents a great deal of work for the homeowner but completely lacks economies of scale, public utilities are by far the more common solution. Public utilities allow the homeowner to simply connect their fuse box to a power grid, and connect their home’s plumbing to both a water main and a sewage line. The power plant deals with the complexities of power generation and transport and the homeowner simply uses whatever share of the public utility’s vast resources he needs, being billed for the total.

By way of comparison, a typical home or work computer is an extraordinarily complex device and its inner workings are out of the reach of most users. A computer owner is responsible for keeping their machine functional, organizing their data, and keeping out viruses and hackers. When computing power is contained at a specialized data center, or “in the cloud”, the responsibility for performing these complicated maintenance tasks is lifted from the user. The user becomes responsible only for maintaining a very simple computer whose purpose is only to connect to the internet and allow these cloud services to take care of the rest.

When a user accesses the cloud for a popular website, many things can happen. The user’s IP address, for example, can be used to establish where the user is located (geolocation). DNS services can then direct the user to a cluster of servers that are close to the user so the site can be accessed rapidly and in the user’s local language. Users do not log in to a server, but they log in to the service they are using by obtaining a session id or a cookie, which is stored in their browser.
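The login flow just described, where the user logs in to the service rather than to a particular server and receives a session id stored in the browser, can be sketched as a toy in-memory service. This is an illustration of the pattern only; the function names and the password check are invented, and a real cloud service would share the session store across its server cluster:

```python
import uuid

# In-memory session store; in a real service this would be shared across servers.
sessions: dict[str, str] = {}

def log_in(username: str, password: str) -> str:
    """Authenticate and hand back a session id for the browser to keep as a cookie."""
    if password != "secret":  # placeholder credential check, for illustration only
        raise PermissionError("bad credentials")
    session_id = uuid.uuid4().hex
    sessions[session_id] = username
    return session_id

def handle_request(session_id: str) -> str:
    """Any server holding the session store can serve the user, given the session id."""
    user = sessions.get(session_id)
    if user is None:
        raise PermissionError("not logged in")
    return f"hello, {user}"

sid = log_in("alice", "secret")
print(handle_request(sid))
```

Because the session id, not a server identity, carries the login, DNS can route the user's next request to any nearby server in the cluster.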

What the user sees in their browser usually comes from a cluster of web servers. The web servers run user interface software which collects commands from the user (mouse clicks, key presses, uploads, etc.) and interprets them. Information is then stored on or retrieved from the database servers or file servers and an updated page is displayed to the user. The data across the multiple servers is synchronised around the world for rapid global access.

Companies can use cloud computing to effectively request and use time-distributed computing resources on the fly. For example, if a company has unanticipated usage spikes above the usual workload, cloud computing can allow the company to meet the overload requirements without needing to pay for hosting a traditional infrastructure for the rest of the year. The benefits of cloud computing include that it can minimize infrastructure costs, save energy, reduce the necessity and frequency of upgrades, and lessen maintenance costs. Some heavy users of cloud computing have seen storage costs fall by 20% and networking costs reduced by 50%.
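The elastic-provisioning idea above, paying for extra capacity only while a spike lasts, can be illustrated with a toy scaling rule. The per-server capacity figure is an assumption made up for the example, not a real provider's number:

```python
CAPACITY_PER_SERVER = 100  # requests/second one instance can handle (assumed)

def servers_needed(load: int) -> int:
    """Return the instance count for the current request rate: ceiling division,
    with at least one server kept running."""
    return max(1, -(-load // CAPACITY_PER_SERVER))

# A quiet day versus an unanticipated spike:
print(servers_needed(80))    # baseline traffic fits on the minimum fleet
print(servers_needed(1250))  # spike: extra instances are provisioned only while needed
```

When the spike subsides, the same rule scales the fleet back down, which is the cost advantage over hosting year-round peak capacity in-house.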

Technical description

The National Institute of Standards and Technology (NIST) provides a concise and specific definition:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, where end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources.[4][5] It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the Internet. This frequently takes the form of web-based tools or applications that users can access and use through a web browser as if they were programs installed locally on their own computers.

Typical cloud computing providers deliver common business applications online that are accessed from another Web service or software like a Web browser, while the software and data are stored on servers.

Most cloud computing infrastructures consist of services delivered through common centers and built on servers. Clouds often appear as single points of access for consumers’ computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers, and typically include service level agreements (SLAs).

Overview

Comparisons

Cloud computing derives characteristics from, but should not be confused with:

  1. Autonomic computing — “computer systems capable of self-management.”
  2. Client–server model — client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).
  3. Grid computing — “a form of distributed computing and parallel computing, whereby a ‘super and virtual computer’ is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks.”
  4. Mainframe computer — powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.[11]
  5. Utility computing — the “packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity.”
  6. Peer-to-peer – distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model).
  7. Service-oriented computing – Cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing consists of the computing techniques that operate on software-as-a-service.

Characteristics

The key characteristic of cloud computing is that the computing is “in the cloud”; that is, the processing (and the related data) is not in a specified, known or static place(s). This is in contrast to a model in which the processing takes place in one or more specific servers that are known. All the other concepts mentioned are supplementary or complementary to this concept.

Architecture

Cloud computing sample architecture

Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over application programming interfaces, usually web services and 3-tier architecture. This resembles the Unix philosophy of having multiple programs each doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.

The two most significant components of cloud computing architecture are known as the front end and the back end. The front end is the part seen by the client, i.e. the computer user. This includes the client’s network (or computer) and the applications used to access the cloud via a user interface such as a web browser. The back end of the cloud computing architecture is the ‘cloud’ itself, comprising various computers, servers and data storage devices.

History

The term “cloud” is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network,[15] and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.

Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture, autonomic and utility computing. Details are abstracted from end-users, who no longer have need for expertise in, or control over, the technology infrastructure “in the cloud” that supports them.

The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that “computation may someday be organized as a public utility.” Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government and community forms, were thoroughly explored in Douglas Parkhill’s 1966 book, The Challenge of the Computer Utility.

The actual term “cloud” borrows from telephony in that telecommunications companies, who until the 1990s primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to utilize their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider, and that which was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure. The first scholarly use of the term “cloud computing” was in a 1997 lecture by Ramnath Chellappa.

After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving “two-pizza teams” could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Service (AWS) on a utility computing basis in 2006.[19][20]

In 2007, Google, IBM and a number of universities embarked on a large-scale cloud computing research project. In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing QoS guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project. By mid-2008, Gartner saw an opportunity for cloud computing “to shape the relationship among consumers of IT services, those who use IT services and those who sell them” and observed that “[o]rganisations are switching from company-owned hardware and software assets to per-use service-based models” so that the “projected shift to cloud computing … will result in dramatic growth in IT products in some areas and significant reductions in other areas.”

Key characteristics

  • Agility improves with users’ ability to rapidly and inexpensively re-provision technological infrastructure resources.
  • Application Programming Interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs.
  • Cost is claimed to be greatly reduced and in a public cloud delivery model capital expenditure is converted to operational expenditure.[27] This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for in-house implementation.
  • Device and location independence enables users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
  • Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilization and efficiency improvements for systems that are often only 10–20% utilized.
  • Reliability is improved if multiple redundant sites are used, which makes well designed cloud computing suitable for business continuity and disaster recovery.
  • Scalability via dynamic (“on-demand”) provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads.
  • Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
  • Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems which are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users’ desire to retain control over the infrastructure and avoid losing control of information security.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user’s computer. They are easier to support and to improve, as the changes reach the clients instantly.
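
The list above notes that cloud systems typically use REST-based APIs. As a rough illustration of what that means, the sketch below models a toy in-memory resource store with the uniform GET/PUT/DELETE verbs and status codes of HTTP. Every name here is invented for the example and is not any provider’s actual API.

```python
class ToyRestStore:
    """Toy in-memory resource store mimicking REST's uniform interface.

    Illustrative only: real cloud APIs expose these verbs over HTTP(S)
    against documented resource URLs; this sketch keeps the semantics
    but stores representations in a plain dictionary.
    """

    def __init__(self):
        self._resources = {}  # resource path -> stored representation

    def put(self, path, representation):
        # HTTP PUT semantics: create or replace, idempotent.
        created = path not in self._resources
        self._resources[path] = representation
        return 201 if created else 200

    def get(self, path):
        # HTTP GET semantics: safe, read-only.
        if path in self._resources:
            return 200, self._resources[path]
        return 404, None

    def delete(self, path):
        # HTTP DELETE semantics: remove the resource if present.
        if self._resources.pop(path, None) is not None:
            return 204
        return 404


store = ToyRestStore()
assert store.put("/instances/i-1", {"size": "small"}) == 201  # created
assert store.get("/instances/i-1") == (200, {"size": "small"})
assert store.delete("/instances/i-1") == 204
assert store.get("/instances/i-1") == (404, None)  # gone again
```

Because every operation is addressed by a resource path and a generic verb, a machine can drive the service the same way a user interface lets a person drive it, which is the point the API characteristic above makes.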

Layers

Once an Internet Protocol connection is established among several computers, it is possible to share services within any one of the following layers.

Client

A cloud client consists of computer hardware and/or computer software that relies on cloud computing for application delivery, or that is specifically designed for delivery of cloud services and that, in either case, is essentially useless without it. Examples include some computers, phones and other devices, operating systems and browsers.

Application

Cloud application services or “Software as a Service (SaaS)” deliver software as a service over the Internet, eliminating the need to install and run the application on the customer’s own computers and simplifying maintenance and support. People tend to use the terms “SaaS” and “cloud” interchangeably, when in fact they are two different things. Key characteristics include:[39]

  • Network-based access to, and management of, commercially available (i.e., not custom) software
  • Activities that are managed from central locations rather than at each customer’s site, enabling customers to access applications remotely via the Web
  • Application delivery that typically is closer to a one-to-many model (single instance, multi-tenant architecture) than to a one-to-one model, including architecture, pricing, partnering, and management characteristics
  • Centralized feature updating, which obviates the need for downloadable patches and upgrades

Platform

Cloud platform services or “Platform as a Service (PaaS)” deliver a computing platform and/or solution stack as a service, often consuming cloud infrastructure and sustaining cloud applications. This model facilitates the deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers.

Infrastructure

Cloud infrastructure services, also known as “Infrastructure as a Service (IaaS)”, deliver computer infrastructure – typically a platform virtualization environment – as a service. Rather than purchasing servers, software, data-center space or network equipment, clients instead buy those resources as a fully outsourced service. Suppliers typically bill such services on a utility computing basis, and the amount of resources consumed (and therefore the cost) will typically reflect the level of activity. IaaS evolved from virtual private server offerings.
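
To make the utility-billing point concrete, here is a hedged sketch of a metered IaaS bill: consumption in each billable dimension is multiplied by a unit rate. The dimensions and rates are invented for illustration, not any real provider’s price list.

```python
# Hypothetical unit rates per billable dimension (invented for this example).
RATES = {
    "instance_hours": 0.10,     # per VM-hour
    "storage_gb_months": 0.05,  # per GB-month stored
    "egress_gb": 0.12,          # per GB transferred out
}

def monthly_bill(usage):
    """Utility-style billing: pay only for metered consumption."""
    return round(sum(usage.get(dim, 0) * rate for dim, rate in RATES.items()), 2)

# One small VM running for 720 hours, 50 GB stored, 10 GB egress:
bill = monthly_bill({"instance_hours": 720, "storage_gb_months": 50, "egress_gb": 10})

# An idle customer is billed nothing, the contrast with owned hardware.
idle = monthly_bill({})
```

The essential property is that `idle` is zero: cost tracks activity rather than capacity, unlike a purchased server that costs the same whether or not it is used.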

Cloud infrastructure often takes the form of a tier 3 data center with many tier 4 attributes, assembled from hundreds of virtual machines.

Server

The servers layer consists of computer hardware and/or computer software products that are specifically designed for the delivery of cloud services, including multi-core processors, cloud-specific operating systems and combined offerings.

Deployment models

Cloud computing types

Public cloud

Public cloud or external cloud describes cloud computing in the traditional mainstream sense, whereby resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web applications/web services, from an off-site third-party provider who bills on a fine-grained utility computing basis.

Community cloud

A community cloud may be established where several organizations have similar requirements and seek to share infrastructure so as to realize some of the benefits of cloud computing. The costs are spread over fewer users than a public cloud (but more than a single tenant). This option may offer a higher level of privacy, security and/or policy compliance. In addition, it can be economically attractive, as the resources (storage, workstations) utilized and shared in the community are already in use and have reached their return on investment. Examples of community clouds include Google’s “Gov Cloud”.

Hybrid cloud and hybrid IT delivery

The main responsibility of the IT department is to deliver services to the business. With the proliferation of cloud computing (both private and public), and the fact that IT departments must also deliver services via traditional, in-house methods, the newest catch-phrase has become “hybrid cloud computing.” Hybrid cloud is also called hybrid delivery by major vendors, including HP, IBM, Oracle and VMware, who offer technology to manage the performance, security and privacy concerns that result from the mixed delivery methods of IT services.

A hybrid storage cloud uses a combination of public and private storage clouds. Hybrid storage clouds are often useful for archiving and backup functions, allowing local data to be replicated to a public cloud.
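
The replication pattern described above can be sketched in a few lines. The interfaces are invented stand-ins, not a real storage API: every write goes to the private store, a replica is mirrored to the public store for backup, and reads prefer the local copy.

```python
class DictStore:
    """Stand-in for either a private array or a public object store (invented)."""
    def __init__(self):
        self.objects = {}

    def write(self, key, data):
        self.objects[key] = data

    def read(self, key):
        return self.objects.get(key)


class HybridStorage:
    """Write-through replication: primary copy private, replica public."""
    def __init__(self, private, public):
        self.private, self.public = private, public

    def put(self, key, data):
        self.private.write(key, data)  # primary copy stays on-site
        self.public.write(key, data)   # replica goes to the public cloud

    def get(self, key):
        # Prefer the local copy; fall back to the public replica.
        data = self.private.read(key)
        return data if data is not None else self.public.read(key)


private, public = DictStore(), DictStore()
hybrid = HybridStorage(private, public)
hybrid.put("report.pdf", b"...contents...")
```

If the private store loses the object (disk failure, archival eviction), the read path transparently falls back to the public replica, which is exactly the backup use case mentioned above.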

Another perspective on deploying a web application in the cloud is using Hybrid Web Hosting, where the hosting infrastructure is a mix between cloud hosting and managed dedicated servers – this is most commonly achieved as part of a web cluster in which some of the nodes are running on real physical hardware and some are running on cloud server instances.

Combined cloud

Two clouds that have been joined together are more correctly called a “combined cloud”. A combined cloud environment consisting of multiple internal and/or external providers “will be typical for most enterprises”. By integrating multiple cloud services users may be able to ease the transition to public cloud services while avoiding issues such as PCI compliance.

Private cloud

Douglas Parkhill first described the concept of a “private computer utility” in his 1966 book The Challenge of the Computer Utility. The idea was based upon direct comparison with other industries (e.g. the electricity industry) and the extensive use of hybrid supply models to balance and mitigate risks.

“Private cloud” and “internal cloud” have been described as neologisms, but the concepts themselves pre-date the term cloud by 40 years. Even within modern utility industries, hybrid models still exist despite the formation of reasonably well-functioning markets and the ability to combine multiple providers.

Some vendors have used the terms to describe offerings that emulate cloud computing on private networks. These (typically virtualization automation) products offer the ability to host applications or virtual machines in a company’s own set of hosts. These provide the benefits of utility computing – shared hardware costs, the ability to recover from failure, and the ability to scale up or down depending upon demand.

Private clouds have attracted criticism because users “still have to buy, build, and manage them” and thus do not benefit from lower up-front capital costs and less hands-on management, essentially “[lacking] the economic model that makes cloud computing such an intriguing concept”. Enterprise IT organizations use their own private cloud(s) for mission-critical and other operational systems to protect critical infrastructures.

Cloud engineering

Cloud engineering is the application of a systematic, disciplined, quantifiable, and interdisciplinary approach to the ideation, conceptualization, development, operation, and maintenance of cloud computing, as well as the study and applied research of the approach, i.e., the application of engineering to the cloud. It is a maturing and evolving discipline that facilitates the adoption, standardization, productization, commoditization, and governance of cloud solutions, leading towards a cloud ecosystem. Cloud engineering is also known as cloud service engineering.

Cloud storage

Cloud storage is a model of networked computer data storage where data is stored on multiple virtual servers, generally hosted by third parties, rather than being hosted on dedicated servers. Hosting companies operate large data centers; and people who require their data to be hosted buy or lease storage capacity from them and use it for their storage needs. The data center operators, in the background, virtualize the resources according to the requirements of the customer and expose them as virtual servers, which the customers can themselves manage. Physically, the resource may span across multiple servers.
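
One simple way a storage operator can spread customers’ data across a pool of virtual servers, as described above, is deterministic hash-based placement: hashing the object key picks the server, so reads land on the same server the writes used. The server names below are hypothetical, and real systems layer consistent hashing and replication on top of this minimal idea.

```python
import hashlib

SERVERS = ["vs-0", "vs-1", "vs-2", "vs-3"]  # hypothetical virtual server pool

def place(key):
    """Map an object key to one virtual server, deterministically."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# The same key always lands on the same server, so no lookup table is needed,
# and keys scatter roughly evenly across the pool.
assert place("photos/cat.jpg") == place("photos/cat.jpg")
assert place("photos/cat.jpg") in SERVERS
```

From the customer’s point of view the pool is one logical store; the placement function is the operator’s internal concern, which is why the physical resource can span multiple servers invisibly.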

The Intercloud

The Intercloud[57] is an interconnected global “cloud of clouds” and an extension of the Internet “network of networks” on which it is based. The term was first used in the context of cloud computing in 2007, when Kevin Kelly stated that “eventually we’ll have the intercloud, the cloud of clouds. This Intercloud will have the dimensions of one machine comprising all servers and attendant cloudbooks on the planet.” It became popular in 2009 and has also been used to describe the datacenter of the future.

The Intercloud scenario is based on the key observation that no single cloud has infinite physical resources. If a cloud saturates the computational and storage resources of its virtualization infrastructure, it would be unable to satisfy further requests for service allocations from its clients. The Intercloud scenario aims to address such situations: in theory, each cloud can draw on the computational and storage resources of other clouds’ virtualization infrastructures. Such a pay-for-use arrangement may open new business opportunities among cloud providers if it moves beyond the theoretical framework. Nevertheless, the Intercloud raises many more challenges than solutions concerning cloud federation, security, interoperability, quality of service, vendor lock-in, trust, legal issues, monitoring and billing.
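
The delegation idea in the paragraph above can be sketched as follows (all names invented): a cloud that has saturated its own virtualization capacity forwards an allocation request to a federated peer rather than rejecting it.

```python
class Cloud:
    """Minimal federation sketch: serve locally, else delegate to peers."""
    def __init__(self, name, capacity, peers=None):
        self.name = name
        self.free = capacity       # remaining allocatable resource units
        self.peers = peers or []   # federated clouds to fall back on

    def allocate(self, units):
        """Return the name of the cloud that served the request, or None."""
        if self.free >= units:          # enough local capacity: serve here
            self.free -= units
            return self.name
        for peer in self.peers:         # saturated: try the federation
            served_by = peer.allocate(units)
            if served_by is not None:
                return served_by
        return None                     # the whole federation is saturated


peer = Cloud("cloud-B", capacity=10)
edge = Cloud("cloud-A", capacity=2, peers=[peer])
assert edge.allocate(2) == "cloud-A"  # local resources suffice
assert edge.allocate(5) == "cloud-B"  # saturated, delegated to the peer
assert edge.allocate(20) is None      # nobody in the federation can serve
```

The sketch deliberately omits everything the text lists as open challenges: authentication between providers, quality-of-service guarantees, and the metering needed to bill the delegated usage.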

The concept of a competitive utility computing market combining many computer utilities was originally described by Douglas Parkhill in his 1966 book, The Challenge of the Computer Utility. This concept has been used many times over the last 40 years and is essentially identical to the Intercloud.

Issues

Privacy

The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program, which worked with AT&T and Verizon to record over 10 million phone calls made by American citizens, cause uncertainty among privacy advocates about the greater powers such arrangements give telecommunication companies to monitor user activity. While there have been efforts (such as US-EU Safe Harbor) to “harmonize” the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select “availability zones.”

Compliance

In order to obtain compliance with regulations including FISMA, HIPAA and SOX in the United States, the Data Protection Directive in the EU and the credit card industry’s PCI DSS, users may have to adopt community or hybrid deployment modes, which are typically more expensive and may offer restricted benefits. This is how Google is able to “manage and meet additional government policy requirements beyond FISMA”[65][66] and Rackspace Cloud is able to claim PCI compliance.[67] Customers in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal data.

Many providers also obtain SAS 70 Type II certification (e.g. Amazon, Salesforce.com, Google and Microsoft), but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely. Providers typically make this information available on request, under non-disclosure agreement.

Legal

In March 2007, Dell applied to trademark the term “cloud computing” (U.S. Trademark 77,139,082) in the United States. The “Notice of Allowance” the company received in July 2008 was canceled in August, resulting in a formal rejection of the trademark application less than a week later. Since 2007, the number of trademark filings covering cloud computing brands, goods and services has increased at an almost exponential rate. As companies sought to better position themselves for cloud computing branding and marketing efforts, cloud computing trademark filings increased by 483% between 2008 and 2009. In 2009, 116 cloud computing trademarks were filed, and trademark analysts predict that over 500 such marks could be filed during 2010.

Other legal cases may shape the use of cloud computing by the public sector. On October 29, 2010, Google filed a lawsuit against the U.S. Department of Interior, which opened up a bid for software that required that bidders use Microsoft’s Business Productivity Online Suite. Google sued, calling the requirement “unduly restrictive of competition.” Scholars have pointed out that, beginning in 2005, the prevalence of open standards and open source may have an impact on the way that public entities choose to select vendors.

Open source

Open source software has provided the foundation for many cloud computing implementations. In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.

Open standards

Most cloud providers expose APIs which are typically well-documented (often under a Creative Commons license) but also unique to their implementation and thus not interoperable. Some vendors have adopted others’ APIs and there are a number of open standards under development, including the OGF’s Open Cloud Computing Interface. The Open Cloud Consortium (OCC) is working to develop consensus on early cloud computing standards and practices.

Security

The relative security of cloud computing services is a contentious issue that may be delaying adoption. Many of the concerns stem from unease in both the private and public sectors about the external management of security-based services, since cloud-based services, private or public, by their very nature place the management of provided services in external hands. This gives cloud computing service providers a strong incentive to prioritize building and maintaining strong management of secure services.

Organizations have been formed to provide standards for a better future in cloud computing services. One organization in particular, the Cloud Security Alliance, is a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing.

Availability and performance

In addition to concerns about security, businesses are also worried about acceptable levels of availability and performance of applications hosted in the cloud.

There are also concerns about a cloud provider shutting down for financial or legal reasons, which has happened in a number of cases.

Sustainability and siting

Although cloud computing is often assumed to be a form of “green computing”, there is as yet no published study to substantiate this assumption. The siting of the servers affects the environmental effects of cloud computing: in areas where the climate favors natural cooling and renewable electricity is readily available, the environmental effects will be more moderate. Countries with favorable conditions, such as Finland, Sweden and Switzerland, are therefore trying to attract cloud computing data centers.

SmartBay, marine research infrastructure of sensors and computational technology, is being developed using cloud computing, an emerging approach to shared infrastructure in which large pools of systems are linked together to provide IT services.

Research

A number of universities, vendors and government organizations are investing in research around the topic of cloud computing. Academic institutions include the University of Melbourne (Australia), Georgia Tech, Yale, Wayne State, Virginia Tech, University of Wisconsin–Madison, Carnegie Mellon, MIT, Indiana University, University of Massachusetts, University of Maryland, IIT Bombay, North Carolina State University, Purdue University, University of California, University of Washington, University of Virginia, University of Utah, and University of Minnesota, among others.

Joint government, academic and vendor collaborative research projects include the IBM/Google Academic Cloud Computing Initiative (ACCI). In October 2007, IBM and Google announced the multi-university project designed to enhance students’ technical knowledge to address the challenges of cloud computing. In April 2009, the National Science Foundation joined the ACCI and awarded approximately $5 million in grants to 14 academic institutions.

In July 2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data center, open source test bed, called Open Cirrus, designed to encourage research into all aspects of cloud computing, service and data center management. Open Cirrus partners include the NSF, the University of Illinois (UIUC), Karlsruhe Institute of Technology, the Infocomm Development Authority (IDA) of Singapore, the Electronics and Telecommunications Research Institute (ETRI) in Korea, the Malaysian Institute for Microelectronic Systems (MIMOS), and the Institute for System Programming at the Russian Academy of Sciences (ISPRAS). In September 2010, more researchers joined the HP/Intel/Yahoo Open Cirrus project for cloud computing research. The new researchers are China Mobile Research Institute (CMRI), Spain’s Supercomputing Center of Galicia (CESGA by its Spanish acronym), Georgia Tech’s Center for Experimental Research in Computer Systems (CERCS) and China Telecom.

In July 2010, HP Labs India announced a new cloud-based technology designed to simplify taking content and making it mobile-enabled, even from low-end devices. Called SiteonMobile, the new technology is designed for emerging markets where people are more likely to access the Internet via mobile phones rather than computers. In November 2010, HP formally opened its Government Cloud Theatre, located at the HP Labs site in Bristol, England. The demonstration facility highlights high-security, highly flexible cloud computing based on intellectual property developed at HP Labs. The aim of the facility is to lessen fears about the security of the cloud. HP Labs Bristol is HP’s second-largest central research location and currently is responsible for researching cloud computing and security.

The IEEE Technical Committee on Services Computing in the IEEE Computer Society sponsors the IEEE International Conference on Cloud Computing (CLOUD). CLOUD 2010 was held on July 5–10, 2010, in Miami, Florida.

On March 23, 2011, Google, Microsoft, HP, Yahoo, Verizon, Deutsche Telekom and 17 other companies formed a nonprofit organization called the Open Networking Foundation, focused on providing support for a new cloud initiative called Software-Defined Networking. The initiative is meant to speed innovation through simple software changes in telecommunications networks, wireless networks, data centers and other networking areas.

Criticism of the term

Some have come to criticize the term as being either too unspecific or even misleading. CEO Larry Ellison of Oracle Corporation asserts that cloud computing is “everything that we already do”, claiming that the company could simply “change the wording on some of our ads” to deploy their cloud-based services.[109][110][111][112][113] Forrester Research VP Frank Gillett questions the very nature of and motivation behind the push for cloud computing, describing what he calls “cloud washing” in the industry whereby companies relabel their products as cloud computing resulting in a lot of marketing innovation on top of real innovation.[114][115] GNU’s Richard Stallman insists that the industry will only use the model to deliver services at ever increasing rates over proprietary systems, otherwise likening it to a “marketing hype campaign”.

http://en.wikipedia.org/wiki/Cloud_computing
Librarian

July 4, 2011

A librarian is an information professional trained in library and information science, which is the organization and management of information services or materials for those with information needs. Typically, librarians work in a public or college library, an elementary or secondary school media center, a library within a business or company, or another information-provision agency like a hospital or law firm. Some librarians are independent entrepreneurs working as information specialists, catalogers, indexers, and in other professional, specialized capacities. Librarians may be categorized as a public, school, correctional, special, independent or academic librarian.

Outline, requirements and positions

Traditionally, librarians have been associated with collections of books, as demonstrated by the etymology of the word “librarian” (< Latin liber, ‘book’). However, modern librarians deal with information in many formats, including books, magazines, newspapers, audio recordings (both musical and spoken-word), video recordings, maps, manuscripts, photographs and other graphic material, bibliographic databases, web searching, and digital resources. Librarians often provide other information services, including computer provision and training, coordination of public programs, basic literacy education, assistive equipment for people with disabilities, and help with finding and using community resources.

Librarian roles and duties

Specific duties vary depending on the size and type of library. Olivia Crosby described librarians as “Information experts in the information age”.[1] Most librarians spend their time working in one of the following areas of a library:

  • Public service librarians work with the public, frequently at the reference desk of lending libraries. Some specialize in serving adults or children. Children’s librarians provide appropriate material for children at all age levels, including pre-readers; conduct specialized programs; and work with children (and often their parents) to help foster interest and competence in the young reader. (In larger libraries, some specialize in teen services, periodicals, or other special collections.)
  • Reference or research librarians help people doing research to find the information they need, through a structured conversation called a reference interview. The help may take the form of research on a specific question; providing direction on the use of databases and other electronic information resources; obtaining specialized materials from other sources; or providing access to and care of delicate or expensive materials. These services are sometimes provided by other library staff who have received a certain amount of special training; some have criticized this trend.[2]
  • Technical service librarians work “behind the scenes” ordering library materials and database subscriptions, computers and other equipment, and supervise the cataloging and physical processing of new materials.
  • Collections development librarians monitor the selection of books and electronic resources. Large libraries often use approval plans, which involve the librarian for a specific subject creating a profile that allows publishers to send relevant books to the library without any additional vetting. Librarians can then see those books when they arrive and decide if they will become part of the collection or not. All collections librarians also have a certain amount of funding to allow them to purchase books and materials that don’t arrive via approval.
  • Archivists can be specialized librarians who deal with archival materials, such as manuscripts, documents and records, though this varies from country to country, and there are other routes to the archival profession.
  • Systems Librarians develop, troubleshoot and maintain library systems, including the library catalog and related systems.
  • Electronic Resources Librarians manage the databases that libraries license from third-party vendors.
  • School Librarians work in school libraries and perform duties as teachers, information technology specialists, and advocates for literacy.
  • Outreach Librarians are charged with providing library and information services for underrepresented groups, such as people with disabilities, low income neighborhoods, home bound adults and seniors, incarcerated and ex-offenders, and homeless and rural communities. In academic libraries, outreach librarians might focus on high school students, transfer students, first-generation college students, and minorities.
  • Instruction Librarians teach information literacy skills in face-to-face classes and/or through the creation of online learning objects. They instruct library users on how to find, evaluate and use information effectively. They are most common in academic libraries.

Experienced librarians may take administrative positions such as library or information center director. Similar to the management of any other organization, they are concerned with the long-term planning of the library, and its relationship with its parent organization (the city or county for a public library, the college/university for an academic library, or the organization served by a special library). In smaller or specialized libraries, librarians typically perform a wide range of the different duties.

Salaries and benefits have improved somewhat in recent years, even in an era of budget tightening and reductions in operating expenses at many libraries. They can vary considerably depending upon the geographic region, the level of funding and support (it is usually better in major academic libraries and government facilities than it is in inner-city school or public libraries), the type of library (a small public or school library versus a large government or academic library), and the position (a beginning librarian versus a department head). Starting salaries at small public libraries can range from $20,000 to $25,000; high-profile positions like director or department head can approach or exceed $100,000 at major academic and large government libraries and some public libraries. Librarians who are paid faculty salaries at a major university (especially if they have a second academic degree), who have an education degree at a school library, who are in administration at a library, or who are in a government library post tend to have higher incomes, especially with experience and better language and technical skills. Despite this, librarians are still wrongly perceived as low-level pink-collar professionals. In reality, the technical competencies and information-seeking skills needed for the job are becoming increasingly important and are relevant to the contemporary economy, and such positions are thus becoming more prominent.

Representative examples of librarian responsibilities:

  • Researching topics of interest for their constituencies.
  • Referring patrons to other community organizations and government offices.
  • Suggesting appropriate books (“readers’ advisory”) for children of different reading levels, and recommending novels for recreational reading.
  • Facilitating and promoting reading clubs.
  • Developing programs for library users of all ages and backgrounds.
  • Managing access to electronic information resources.
  • Building collections to respond to changing community needs or demands.
  • Writing grants to gain funding for expanded programs or collections.
  • Digitizing collections for online access.
  • Answering incoming reference questions via telephone, postal mail, email, fax, and chat.
  • Making and enforcing computer appointments on the public access Internet computers.[3]

http://www.pemustaka.com/pustakawan