Blog Archives

Introduction to GeoWeb Technologies – Week 1 – 5 Readings

Weeks 1-5 Readings and Slides – General Geographic and Cartographic Competencies

UT-Knoxville School of Information Sciences Course Options (2013-2014)

Sometimes I am critical of the courses offered through my graduate program.
In truth, there is a decent balance of specialty coursework serving a diverse range of career paths for information professionals in training.
I found this list of course offerings in the 2013–2014 graduate catalog for the College of Communication and Information, and I thought it would be worthwhile to preserve it as a record of the options I have had to choose from.
•  INSC 450 – Writing About Science and Medicine
•  INSC 500 – Thesis
•  INSC 502 – Registration for Use of Facilities
•  INSC 504 – Research Methods in Information Sciences
•  INSC 505 – ePortfolio
•  INSC 510 – Information Environment
•  INSC 520 – Information Representation and Organization
•  INSC 521 – Cataloging and Classification
•  INSC 522 – Cataloging of Non-print Materials
•  INSC 523 – Abstracting and Indexing
•  INSC 530 – Information Access and Retrieval
•  INSC 531 – Sources and Services for the Social Sciences
•  INSC 532 – Sources and Services for Science and Engineering
•  INSC 533 – Sources and Services for the Humanities
•  INSC 534 – Government Information Sources
•  INSC 535 – Advanced Information Retrieval
•  INSC 541 – Knowledge Management for Information Professionals
•  INSC 542 – Social Informatics
•  INSC 544 – Business Intelligence for Information Professionals
•  INSC 545 – Scientific and Technical Communications
•  INSC 546 – Environmental Informatics
•  INSC 547 – Health Sciences Information Centers
•  INSC 548 – Federal Libraries and Information Centers
•  INSC 550 – Management of Information Organizations
•  INSC 551 – School Library Media Centers
•  INSC 552 – Academic Libraries
•  INSC 553 – Specialized Information Agencies and Services
•  INSC 554 – Public Library Management and Services
•  INSC 557 – User Instruction
•  INSC 559 – Grant Development for Information Professionals
•  INSC 560 – Development and Management of Collections
•  INSC 564 – Archives and Records Management
•  INSC 565 – Digital Libraries
•  INSC 571 – Resources and Services for Children
•  INSC 572 – Resources and Services for Young Adults
•  INSC 573 – Programming for Children and Young Adults
•  INSC 574 – Resources and Services for Adults
•  INSC 575 – Valuing Diversity: International and Intercultural Resources for Youth
•  INSC 576 – Storytelling in Libraries and Classrooms
•  INSC 577 – Picture Books Across the Curriculum
•  INSC 581 – Information Networking Applications
•  INSC 582 – Information Systems Design and Implementation
•  INSC 584 – Database Management Systems
•  INSC 585 – Information Technologies
•  INSC 587 – Mining the Web
•  INSC 588 – Human-Computer Interaction
•  INSC 590 – Problems in Information Sciences
•  INSC 591 – Independent Project or Research
•  INSC 592 – Big Data Analytics
•  INSC 594 – Graduate Research Participation
•  INSC 595 – Student Teaching in School Library Information Center
•  INSC 596 – Field-Based Experience in School Library Information Centers
•  INSC 597 – Information Architecture
•  INSC 598 – Web Design
•  INSC 599 – Practicum
•  INSC 680 – Information Science Theory

Big Data Analytics Coursework

Big Data Analytics is listed as IS 592 in the catalog.  It is a higher-level class building on material from IS 584 (Database Management Systems), and I'm looking forward to exploring data analytics and data science in a formal classroom setting.

[Image: the data life cycle, with steps Plan, Collect, Assure, Describe, Preserve, Discover, Integrate, and Analyze.]

I’m grateful to have the opportunity to study this topic at the School of Information Sciences.  It is one of a handful of courses in the overall catalog closely aligned with my career goal to work as a professional data manager.  My intuition is that individuals trained to manage data must have extensive knowledge of the data life cycle to manage data effectively across the full span of its life.  Note that “Analyze” is one of the last steps.  Even under a strictly traditional view of curation, an archivist who is not familiar with the flow of information in the “Analyze” step is not well positioned to receive and curate research output that is increasingly data intensive.

From the first lecture, a quote caught my attention:

We project a need for 1.5 million additional managers and analysts in the United States who can ask the right questions and consume the results of the analysis of Big Data effectively.

This is a huge and growing field.  What caught my eye is the growing number of specialty programs turning out students with the skills to analyze data.  Even at UT-Knoxville, there are three “data science” courses – one in Computer Science, one in Statistics, and one in Information Sciences – housed in three separate colleges (Engineering, Arts & Sciences, and Communication and Information). Established information studies programs are likely seats for new data science programs and curricula.  Berkeley’s School of Information just stood up a new program, the Master of Information and Data Science.  Short “boot camps” are also offered, although I’m not sure whether those programs will produce data “scientists” or just data “analysts” – the key phrase in the quote above is “ask the right questions.” Are our new data science programs able to impart the skill of asking questions using the scientific method?

From the first lecture, here are skills that data analysts need:

  1. Databases
  2. Data mining
  3. Statistical applications
  4. Predictive analytics
  5. Business intelligence
  6. Data modeling and data visualization
  7. Metacognitive strategies
  8. Interpersonal skills

Coursework for IS 592 will be collected at the following URL: https://mountainsol.wordpress.com/category/coursework/big-data-analtyics/

Geocoding Historic Homes with Google Fusion Tables

Using data available from Wikipedia concerning historic homes constructed near the turn of the 20th century, I have created a map of structures in Knoxville, Tennessee designed by the architect George F. Barber.

https://www.google.com/fusiontables/embedviz?q=select+col1+from+1MIXz_F_4LmZVPhtrEdfps6V7Ph9LYjAG4I_z6SQ&viz=MAP&h=false&lat=35.98360461230385&lng=-83.90514483678731&t=1&z=15&l=col1&y=2&tmplt=2&hml=GEOCODABLE

I pulled the data from <http://en.wikipedia.org/wiki/List_of_George_Franklin_Barber_works> with a simple copy-and-paste operation into an Apache OpenOffice Calc spreadsheet.

I saved the spreadsheet as a .csv file, comma delimited.

I added a new column and duplicated the street addresses there, then deleted the parentheses surrounding each street address, along with the name of the property, so the column held only the address.

In column 1, I deleted the street address and its parentheses to retain only the name of the property.
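For anyone who wants to script this cleanup instead of doing it by hand, here is a minimal Python sketch of the same split. The file names, and the assumption that column 1 looks like "Property Name (123 Main St.)", are mine, based on the steps above, and not part of the original workflow.

```python
import csv
import re

# Assumed input: a CSV pasted from Wikipedia where column 1 looks like
# "Property Name (123 Main St.)". File names are placeholders.
with open("barber_works.csv", newline="", encoding="utf-8") as src, \
     open("barber_works_clean.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    writer.writerow(["name", "street"])
    for row in reader:
        # Split "Name (address)" into its two parts.
        match = re.match(r"^(.*?)\s*\((.*?)\)\s*$", row[0])
        if match:
            name, street = match.group(1), match.group(2)
        else:
            name, street = row[0], ""  # no parenthetical found
        writer.writerow([name, street])
```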

After saving the .csv file again, I opened up my personal Google Drive account.

I added the “Google Fusion Tables” application from Google, and then selected “create new fusion table” as instructed in Google’s tutorial.


After importing the data, I ran into some problems with the division of street, city, and state.  Under “File > Geocode,” my “street” column was not immediately recognized as a location address.  After changing the “street” column’s type in the “Rows 1” view to “Location,” I was able to direct the application to geocode based on the street address.

At present, this is a very basic map.

I do like the ease with which it obtained the lat/long coordinates, and how it transformed the table data into “cards” that pop up on the map with the pertinent information.

I’m also happy that it can export the resultant geocoded map as KML.

For future work, I think it would be interesting to link a Flickr or other photo management system to the Geocoder.

I also understand it is possible to add a Google Street View image of a particular property.

https://support.google.com/fusiontables/answer/2796058?hl=en

However, it is necessary to obtain the location information in the form of lat/long for this to work.
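As a rough illustration of why the lat/long pair matters: once you have coordinates, the Google Street View Image API can return a photo for them via a simple URL. This is only a sketch under my own assumptions (the API key and the example coordinates are placeholders); it is not something Fusion Tables does for you.

```python
from urllib.parse import urlencode

# Sketch: build a Google Street View Image API request URL for one
# property. The API key is a placeholder you must supply yourself.
def street_view_url(lat, lng, key, size="600x400"):
    params = urlencode({"size": size, "location": f"{lat},{lng}", "key": key})
    return "https://maps.googleapis.com/maps/api/streetview?" + params

# Example coordinates near downtown Knoxville (placeholder values).
print(street_view_url(35.9836, -83.9051, key="YOUR_API_KEY"))
```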

It is unfortunate that Fusion Tables does not append the lat/long information to the table.

There is software available that can provide this information.

From my course in the Geography Department, I’m aware of this software:

http://ctasgis02.psur.utk.edu/credapopulation/freeware.htm

The application of interest is listed under “Google Geocoder.”

Geocoding with Google Earth is accomplished through two programs: KMLGeocode and KMLReport. The first program reads Excel worksheets or an XML export of a table from a relational database system and creates a KML file that can be loaded into Google Earth. Once the KML file is loaded, Google Earth will attempt to geocode each entry in it.  After the file is geocoded, it can be saved to a new KML file. This file will contain the coordinates of each address found. The second program, KMLReport, reads that file and generates two files: one for geocoded addresses and one for addresses that were not found. The file for geocoded addresses is written as a comma-delimited text file that can be loaded into ArcGIS.
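I have not tried these programs yet, but the first step they describe – turning table rows into a KML file of address-keyed placemarks that Google Earth can try to geocode – is simple enough to sketch by hand. The Python below is my own approximation, not KMLGeocode's actual behavior; the file names and the ", Knoxville, TN" suffix are assumptions.

```python
import csv
from xml.sax.saxutils import escape

# Sketch: write table rows as KML placemarks with an <address> element,
# which Google Earth can attempt to geocode when the file is loaded.
header = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n')
footer = "</Document></kml>\n"

with open("barber_works_clean.csv", newline="", encoding="utf-8") as src, \
     open("barber_works.kml", "w", encoding="utf-8") as dst:
    dst.write(header)
    for row in csv.DictReader(src):
        dst.write("<Placemark>"
                  f"<name>{escape(row['name'])}</name>"
                  f"<address>{escape(row['street'])}, Knoxville, TN</address>"
                  "</Placemark>\n")
    dst.write(footer)
```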

At the moment, it seems like obtaining a street view would require me to obtain the lat/long coordinates for the data, then append them to the Fusion Table.
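One way to get those coordinates in bulk would be the Google Geocoding API, which returns JSON for an address query. The sketch below is my own guess at how I might wire that up (the API key, file names, and column names are placeholders carried over from the cleanup sketch above):

```python
import csv
import json
import time
from urllib.parse import urlencode
from urllib.request import urlopen

API_KEY = "YOUR_API_KEY"  # placeholder; supply your own key

def geocode(address):
    """Return (lat, lng) for an address via the Google Geocoding API."""
    query = urlencode({"address": address, "key": API_KEY})
    url = "https://maps.googleapis.com/maps/api/geocode/json?" + query
    with urlopen(url) as resp:
        data = json.load(resp)
    if data["status"] != "OK":
        return None, None  # address not found or quota exceeded
    location = data["results"][0]["geometry"]["location"]
    return location["lat"], location["lng"]

# Append lat/lng columns to the cleaned CSV for upload to Fusion Tables.
with open("barber_works_clean.csv", newline="", encoding="utf-8") as src, \
     open("barber_works_geocoded.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst)
    writer.writerow(["name", "street", "lat", "lng"])
    for row in csv.DictReader(src):
        lat, lng = geocode(row["street"] + ", Knoxville, TN")
        writer.writerow([row["name"], row["street"], lat, lng])
        time.sleep(0.2)  # be polite about request rate
```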

Fusion has some advantages, including automatic publishing to the web, the ability to easily update table data, and support for collaborative data entry. I can see some potential applications for my neighborhood organization, or for any other collaborative group with limited access to mapping technology (especially a library system or local municipality that does not have thousands of dollars to spend on ESRI software).