Jerusalem: locating the colonies and neighborhoods

From FDHwiki

Introduction

The goal of this project is to study the construction of neighborhoods in Jerusalem over time. We collect information about Jerusalem neighborhoods from four sources: the book Jerusalem and its Environs, the Wikipedia category Neighbourhoods of Jerusalem, the Wikipedia list Places of Jerusalem - Neighborhoods, and the Wikidata entity neighborhood of Jerusalem. These sources provide different information with different focuses. We merge this content through matching methods and present it on a web page that organizes and visualizes all the information. With the timeline, search, and detail-view features, users can get a clear picture of Jerusalem's neighborhoods in our map interface. The matching approach we use can also be applied easily to other cities with multiple sources of information (similar, different, or even contradictory), giving it potential for future reuse.

Motivation

The study of the geography and chronology of neighborhoods in Jerusalem can provide valuable insights into the city's past and present. The location of a neighborhood can often reflect the social, economic, and political forces that shaped it, as well as the cultural traditions and values of its residents.

Examining the founding year of a neighborhood can also provide insight into the city's history and development. Visualizing the location and founding year of neighborhoods in Jerusalem is a powerful tool for understanding the city's past and present. By mapping and analyzing these data, it is possible to gain a deeper understanding of the cultural, social, and economic dynamics of different neighborhoods and the forces that have shaped them.

A city with as rich and varied a history as Jerusalem is described in many different accounts. These accounts from various sources are an important basis for studying the city, and how to integrate the information they provide is one of the focuses of our research.

Deliverables

  • OCR results of the Development of Jerusalem neighborhoods information from Jerusalem and its Environs.
  • Crawler results from Wikipedia category Neighbourhoods of Jerusalem, Wikipedia list Places of Jerusalem - Neighborhoods, and Wikidata entity neighborhood of Jerusalem.
  • Integrated database with multiple information sources after perfect matching and fuzzy matching.
  • An interactive and user-friendly webpage showing the changes in neighborhoods of Jerusalem with time, which contains:
    • A timeline feature that illustrates the evolution of the construction of neighborhoods in Jerusalem over time.
    • A search function that enables users to search for neighborhoods by name.
    • A dedicated sub-page that contains relevant information for each neighborhood.

Methodology

Data collection

OCR method for paper book

Jerusalem and its Environs is a book written by TODO in TODO that provides detailed information about the development of Jerusalem's neighborhoods across different time periods. We used OCR technology to extract the relevant information from the book, and also conducted manual proofreading to ensure the accuracy of the data, since some neighborhood names contain punctuation and annotations. Data from the book includes the name, year of foundation, number of inhabitants, and initiating entity of each neighborhood; remarks are also included in some cases.

This source gives us the year of foundation for Jerusalem neighborhoods, a crucial aspect of our study. However, not every neighborhood has a precise year of construction. For neighborhoods whose foundation years are given as intervals, and for those with ambiguous construction years (e.g., "1900s" or "end of Mandate"), we take the first year of the period for further analysis.
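The normalization described above (mapping intervals and ambiguous period labels to a single first year) can be sketched as follows. The function name, the regular expressions, and the period lookup table are illustrative assumptions, not the project's actual code; the start year assigned to "end of Mandate" in particular is a placeholder.

```python
import re

def normalize_year(raw, period_starts=None):
    """Map a raw foundation-year string to a single integer year.

    Interval strings ("1860-1870") and decade strings ("1900s") are
    mapped to their first year; named periods are resolved through a
    small lookup table (hypothetical values, for illustration only).
    """
    period_starts = period_starts or {"end of Mandate": 1948}
    raw = raw.strip()
    if raw in period_starts:                         # named period
        return period_starts[raw]
    m = re.match(r"^(\d{4})s$", raw)                 # decade, e.g. "1900s"
    if m:
        return int(m.group(1))
    m = re.match(r"^(\d{4})\s*[-–]\s*\d{4}$", raw)   # interval, e.g. "1860-1870"
    if m:
        return int(m.group(1))
    return int(raw)                                  # plain year
```

For example, `normalize_year("1860-1870")` and `normalize_year("1900s")` both collapse to the first year of their respective periods.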

The following is an example of the comparison of raw data, OCR result, and manually checked result:

TODO

Crawler method for Wikipedia and Wikidata information

We use the Wikidata API together with the requests and BeautifulSoup packages to implement a crawler that retrieves data from the internet (primarily Wikipedia and Wikidata). From Wikipedia, we mainly collect the coordinates of each neighborhood. From Wikidata, we pay particular attention to neighborhoods with an 'inception' attribute, which serves as another source for the founding year of the neighborhoods.

It should be noted that there is significant overlap among the online sources, which requires further data cleaning and matching. We store the data in separate dataframes for further processing.
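The Wikidata side of the crawl boils down to extracting a few claims from each entity record. The sketch below parses a trimmed, hypothetical record in the JSON shape that Wikidata's entity-data endpoint returns, pulling the English label, the inception year (property P571) and the coordinates (property P625); the entity ID and values are invented for illustration.

```python
# A trimmed, hypothetical Wikidata entity record, in the shape returned by
# https://www.wikidata.org/wiki/Special:EntityData/<QID>.json
sample = {
    "entities": {
        "Q1000000": {
            "labels": {"en": {"language": "en", "value": "Example Neighborhood"}},
            "claims": {
                "P571": [{"mainsnak": {"datavalue": {"value": {"time": "+1921-00-00T00:00:00Z"}}}}],
                "P625": [{"mainsnak": {"datavalue": {"value": {"latitude": 31.78, "longitude": 35.22}}}}],
            },
        }
    }
}

def extract_fields(record):
    """Pull name, inception year (P571) and coordinates (P625) from one entity."""
    entity = next(iter(record["entities"].values()))
    name = entity["labels"]["en"]["value"]
    claims = entity["claims"]
    year = None
    if "P571" in claims:
        time = claims["P571"][0]["mainsnak"]["datavalue"]["value"]["time"]
        year = int(time[1:5])          # "+1921-00-00T00:00:00Z" -> 1921
    coords = None
    if "P625" in claims:
        v = claims["P625"][0]["mainsnak"]["datavalue"]["value"]
        coords = (v["latitude"], v["longitude"])
    return {"name": name, "inception": year, "coordinates": coords}
```

In practice the record would be fetched with requests and the resulting rows collected into a dataframe; the parsing step shown here is the same either way.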

The following is an example of data respectively from the Wikipedia category Neighbourhoods of Jerusalem, the Wikipedia list Places of Jerusalem - Neighborhoods and Wikidata entity neighborhood of Jerusalem.

TODO

Data matching

Matching data from Wikipedia sources

To deal with the overlap, we first match neighborhood entries across the sources by exact name comparison (perfect matching), and then apply fuzzy matching to link names that differ only in spelling or transliteration.
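A minimal sketch of the two-stage matching named in the Deliverables (perfect matching first, then fuzzy matching) is shown below. It uses the standard-library difflib for string similarity; the project may well use a different fuzzy-matching library, and the cutoff value is a tunable assumption rather than the actual threshold.

```python
from difflib import get_close_matches

def match_names(primary, secondary, cutoff=0.85):
    """Match neighborhood names from two sources.

    Stage 1 takes exact (perfect) matches; stage 2 falls back to fuzzy
    matching on string similarity for the remaining names.
    """
    matches, unmatched = {}, []
    pool = set(secondary)
    for name in primary:
        if name in pool:                                   # stage 1: perfect match
            matches[name] = name
            pool.discard(name)
            continue
        close = get_close_matches(name, pool, n=1, cutoff=cutoff)
        if close:                                          # stage 2: fuzzy match
            matches[name] = close[0]
            pool.discard(close[0])
        else:
            unmatched.append(name)
    return matches, unmatched
```

Fuzzy matching catches variants such as "Mea Shearim" vs. "Mea She'arim", which differ only by an apostrophe and so score well above the cutoff.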


Matching data from the book

Database establishment

Webpage development

Search function

Timeline feature

Result assessment

Limitations and Further Work

Limitations

Limitations due to the lack of data


Further Work


Project Plan and Milestones

By Week 3
  • Brainstorm project ideas.
  • Prepare slides for initial project idea presentation.
By Week 4
  • Organize the Jerusalem neighborhood information from the book into csv files by OCR.
  • Conduct manual review and adjust formats.
By Week 5
  • Get neighborhood information on Wikipedia through crawlers, including names, links, and coordinates.
By Week 6
  • Merge the data retrieved from different Wikipedia pages.
  • Extract the neighborhoods that appear in both the book and Wikipedia.
By Week 7
  • Start working on webpages.
  • Decide to use GitHub Pages and Bootstrap as our output methods and learn the basic concepts.
By Week 8
  • Use fuzzy matching method to link information from the book and Wikipedia.
  • Work on webpage: use Leaflet to present maps.
By Week 9
  • Transfer data into a format usable by the HTML pages.
  • Combine the front-end webpage and back-end data.
  • Create our first demo webpage.
By Week 10
  • Fill in information on wiki.
  • Get prepared for the midterm presentation.
By Week 11
  • Get neighborhood information from Wikidata through crawlers, including names, links, and establishment times.
  • Merge the information from Wikidata into the existing data.
  • Work out a way to deal with duplicated data and extract results from fuzzy matching.
By Week 12
  • Find neighborhoods with area shapes and work out a way to visualize them.
  • Create a search function on the website.
  • Add information and adjust the website.
By Week 13
  • Create another page to list the information on the website.
  • Create GitHub Pages for our webpage.
By Week 14
  • Complete the wiki on motivation, methods, results...
  • Refine the visualization of our webpage
By Week 15
  • Final presentation

GitHub Repository

https://github.com/WayerLiu/fdh_jerusalem.github.io