Revision as of 21:43, 11 September 2017
Welcome to the wiki of the course Foundation of Digital Humanities (DH-405).
==Contact==
Professor: Frédéric Kaplan
Assistants: ...
==Summary==
This course gives an introduction to the fundamental concepts and methods of the Digital Humanities, from both a theoretical and an applied point of view. It introduces the Digital Humanities circle of processing and interpretation, from data acquisition to new understandings. The first part of the course presents the technical pipelines for digitising, analysing and modelling written documents (printed and handwritten), maps, photographs, and 3D objects and environments. The second part details the principles of the most important algorithms for document processing (layout analysis, deep learning methods), knowledge modelling (semantic web, ontologies, graph databases), and generative models and simulation (rule-based inference, deep-learning-based generation). The third part focuses on platform management from the points of view of data, users and bots. Students will practise the skills they learn by directly analysing and interpreting cultural datasets from ongoing large-scale research projects (Venice Time Machine, Swiss newspaper archives).