loleg / dokuwiki-projects

Documentation of projects in the make.opendata.ch Dokuwiki instance


Contributors: loleg

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...  -----> Python app detected -----> Installing python-3.6.2 -----> Installing pip -----> Installing requirements with pip  Collecting beautifulsoup4==4.6.3  Downloading beautifulsoup4-4.6.3-py3-none-any.whl (90 kB)  Collecting requests==2.20.0  Downloading requests-2.20.0-py2.py3-none-any.whl (60 kB)  Collecting python-dateutil==2.7.0  Downloading python_dateutil-2.7.0-py2.py3-none-any.whl (207 kB)  Collecting idna<2.8,>=2.5  Downloading idna-2.7-py2.py3-none-any.whl (58 kB)  Collecting certifi>=2017.4.17  Downloading certifi-2020.11.8-py2.py3-none-any.whl (155 kB)  Collecting urllib3<1.25,>=1.21.1  Downloading urllib3-1.24.3-py2.py3-none-any.whl (118 kB)  Collecting chardet<3.1.0,>=3.0.2  Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)  Collecting six>=1.5  Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)  Installing collected packages: beautifulsoup4, idna, certifi, urllib3, chardet, requests, six, python-dateutil  Successfully installed beautifulsoup4-4.6.3 certifi-2020.11.8 chardet-3.0.4 idna-2.7 python-dateutil-2.7.0 requests-2.20.0 six-1.15.0 urllib3-1.24.3   -----> Discovering process types  Procfile declares types -> scraper Injecting scraper and running... Refreshing start page https://make.opendata.ch/wiki/event:home Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2021-04 Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2020-06 Retrieving event: GLAM mix'n'hack 2019 / https://make.opendata.ch/wiki/event:2019-09 Downloading https://make.opendata.ch/wiki/project:back_to_the_greek_universe Back to the Greek Universe 2020-02-06 14:44:00 4195 Downloading https://make.opendata.ch/wiki/project:covimas CoViMAS 2019-09-08 15:17:00 4968 Downloading https://make.opendata.ch/wiki/project:opera_forever Opera Forever 2019-10-31 17:38:00 3570 Downloading https://make.opendata.ch/wiki/project:time_gazer Time Gazer 2019-09-22 11:34:00 2013 Downloading https://make.opendata.ch/wiki/project:human_name_creativity Human Name Creativity 2019-09-24 01:34:00 2906 Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2018-10 Downloading https://make.opendata.ch/wiki/project:associative_image_search Art on Paper Gallery 2018-10-28 12:59:00 1701 Downloading https://make.opendata.ch/wiki/project:personal_museum_guide Artify 2018-10-28 13:25:00 1775 Downloading https://make.opendata.ch/wiki/project:web_exhibition Ask the Artist 2018-10-29 14:51:00 2784 Downloading https://make.opendata.ch/wiki/project:dncsonyc Dog Name Creativity Survey of New York City 2018-10-28 15:42:00 2394 Downloading https://make.opendata.ch/wiki/project:find_me_an_exhibit Find Me an Exhibit 2018-10-27 20:36:00 742 Downloading https://make.opendata.ch/wiki/project:letterjongg Letterjongg (Adaptation of the Mahjong Game) 2019-07-26 21:28:00 3878 Downloading https://make.opendata.ch/wiki/project:graph_queries New Frontiers in Graph Queries 2018-10-27 22:26:00 2121 Downloading https://make.opendata.ch/wiki/project:sex_and_crime Sex, Crime & Pub Brawls in Early Modern Zurich 2018-10-27 11:59:00 862 Downloading https://make.opendata.ch/wiki/project:swiss_art_stories_on_twitter Swiss Art Stories on Twitter 2018-10-28 13:35:00 1817 Downloading https://make.opendata.ch/wiki/project:view_find View Find (Library Catalog Search) 2018-10-27 13:58:00 367 Downloading https://make.opendata.ch/wiki/project:vrvisits_zurich VR Visits Zurich 2018-11-01 16:25:00 585 Downloading 
https://make.opendata.ch/wiki/project:virtual_3d_exhibition Walking Around the Globe – 3D Picture Exhibition 2019-05-08 17:15:00 5239 Downloading https://make.opendata.ch/wiki/project:weartonauts We-Art-o-nauts: How to Provide a Better Art Experience 2018-10-28 22:49:00 1268 Downloading https://make.opendata.ch/wiki/project:mutilingual_data_search Wikidata-driven Multilingual Search in the Library Catalogue 2018-11-20 08:25:00 6209 Downloading https://make.opendata.ch/wiki/project:historical_tours Zurich Historical Photo Tours (formerly: Historical Tours based on Geo-data) 2018-10-31 08:27:00 2122 Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2017-09 Downloading https://make.opendata.ch/wiki/project:league_of_nations_pictures_annotation Collaborative Face Recognition and Picture Annotation for Archives 2017-09-16 18:35:00 5738 Downloading https://make.opendata.ch/wiki/project:jung_rilke_correspondance_network Jung - Rilke Correspondance Network 2017-10-26 16:36:00 4623 Downloading https://make.opendata.ch/wiki/project:schauspielhauswikidata Schauspielhaus Zürich performances in Wikidata 2020-03-17 11:55:00 1976 Downloading https://make.opendata.ch/wiki/project:swissvideogamesdirectory Swiss Video Game Directory 2017-09-27 17:37:00 1639 Downloading https://make.opendata.ch/wiki/project:big_data_analytics Big Data Analytics (bibliographical data) 2017-09-18 10:12:00 2648 Downloading https://make.opendata.ch/wiki/project:medicalhistorycollections Medical History Collection 2019-04-09 16:02:00 1715 Downloading https://make.opendata.ch/wiki/project:oldcatholic_church_switzerland Old-catholic Church Switzerland 2017-10-12 10:11:00 2637 Downloading https://make.opendata.ch/wiki/project:swisssocialarchivesentitymatch Swiss Social Archives - Wikidata entity match 2017-09-16 17:49:00 280 Downloading https://make.opendata.ch/wiki/project:gutenberg_memory Hacking Gutenberg: A Moveable Type Game 2018-10-27 20:25:00 4010 Downloading https://make.opendata.ch/wiki/project:openguesser OpenGuesser 2017-09-17 00:06:00 1475 Downloading https://make.opendata.ch/wiki/project:wikidata_ontology_explorer Wikidata Ontology Explorer 2017-09-18 07:59:00 408 Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2016-07 Downloading https://make.opendata.ch/wiki/project:dodis_goes_hackathon Dodis goes Hackathon 2016-07-02 17:37:00 317 Downloading https://make.opendata.ch/wiki/project:historical_maps Historical Maps 2016-07-04 14:57:00 2942 Downloading https://make.opendata.ch/wiki/project:glamsearchportal GLAM RDF Search Portal 2016-07-07 11:31:00 3386 Downloading https://make.opendata.ch/wiki/project:glam_rdf_data_online Online SPARQL endpoint with RDF data 2016-07-22 16:44:00 1417 Downloading https://make.opendata.ch/wiki/project:sqlrdf_mapper SQL RDF MAPPER - converts SQL Data into RDF Data 2016-07-22 16:39:00 3022 Downloading https://make.opendata.ch/wiki/project:online_thesaurus_building_with_skos_shuttle Online Thesaurus Building with SKOS Shuttle 2016-07-07 11:32:00 456 Downloading https://make.opendata.ch/wiki/project:performing_arts_ontology Performing Arts Ontology 2019-02-03 21:16:00 1092 Downloading https://make.opendata.ch/wiki/project:linked_open_theatre_data Linked Open Theatre Data 2018-04-07 08:09:00 828 Downloading https://make.opendata.ch/wiki/project:visual_exploration_of_corporis_fabrica Visual exploration of corporis fabrica 2016-07-03 16:44:00 948 Downloading https://make.opendata.ch/wiki/project:manesse_gammon Manesse Gammon – 
A medieval game: Play against «Herrn Gœli» 2017-09-15 22:22:00 3443 Downloading https://make.opendata.ch/wiki/project:visualize_relationships Visualize Relationships in Authority Datasets 2016-07-03 18:24:00 1217 Downloading https://make.opendata.ch/wiki/project:sfadata_at_eexcess SFA-Metadata at EEXCESS 2016-07-02 14:40:00 1350 Downloading https://make.opendata.ch/wiki/project:kamusi Kamusi 2019-02-03 21:16:00 924 Downloading https://make.opendata.ch/wiki/project:hds_out_of_the_box HDS out of the box 2016-07-04 15:12:00 10626 Downloading https://make.opendata.ch/wiki/project:basicopendataideasforarchives Basic Open Data Idea for Archives 2018-11-14 06:37:00 1146 Downloading https://make.opendata.ch/wiki/project:kirchenarchive Kirchenarchive 2018-10-26 15:45:00 1785 Downloading https://make.opendata.ch/wiki/project:vsjfrefugees_migration vsjf refugees migration 2016-07-09 14:44:00 2067 Downloading https://make.opendata.ch/wiki/project:sprichort sprichort 2016-08-05 13:22:00 950 Downloading https://make.opendata.ch/wiki/project:dadabot Animation in the spirit of dada poetry 2016-07-03 10:43:00 1069 Downloading https://make.opendata.ch/wiki/project:glamhackclip2016 glamhackclip2016 2016-07-02 17:22:00 952 Downloading https://make.opendata.ch/wiki/project:swisslibrariesonamap Swiss Libraries on a map 2019-09-03 15:51:00 861 Retrieving event: Open Energy Data Hackdays / https://make.opendata.ch/wiki/event:2016-04 Downloading https://make.opendata.ch/wiki/project:atum Atum 2016-04-08 22:58:00 546 Downloading https://make.opendata.ch/wiki/project:ec Swiss Energy Dashboard 2016-04-28 10:24:00 1310 Downloading https://make.opendata.ch/wiki/project:energiestrategie_2050 Energiestrategie 2050 verstehen 2016-04-11 10:25:00 2429 Downloading https://make.opendata.ch/wiki/project:hotornot Hot or not for solar 2016-04-19 08:09:00 3625 Downloading https://make.opendata.ch/wiki/project:klumpenrisiko Klumpenrisiko 2016-05-01 11:37:00 4157 Downloading https://make.opendata.ch/wiki/project:lofos Lófos 2016-04-11 10:08:00 946 Downloading https://make.opendata.ch/wiki/project:periodic_aggregation_of_power_system_data open-power-system-timeseries 2016-04-08 18:47:00 1332 Downloading https://make.opendata.ch/wiki/project:swiss_energy_balance Swiss Energy Balance 2016-04-14 08:17:00 821 Retrieving event: Elections Hackdays / https://make.opendata.ch/wiki/event:2015-09 Downloading https://make.opendata.ch/wiki/project:chparlscraping CHParlScraping 2015-09-07 16:23:00 4022 Downloading https://make.opendata.ch/wiki/project:foreigners_vote Foreigners vote 2015-09-08 10:26:00 638 Downloading https://make.opendata.ch/wiki/project:interest_finder InterestFinder 2018-11-13 14:47:00 3671 Downloading https://make.opendata.ch/wiki/project:kandidaten KandiDaten 2015-09-10 00:49:00 2590 Downloading https://make.opendata.ch/wiki/project:komitees Komitees im Parlament 2015-09-05 15:25:00 382 Downloading https://make.opendata.ch/wiki/project:polegauge Political Sentiment Gauge 2015-09-05 17:16:00 1827 Downloading https://make.opendata.ch/wiki/project:politweets Politweets 2015-10-10 16:02:00 1598 Downloading https://make.opendata.ch/wiki/project:stimmenzaehler StimmenZähler 2015-09-05 15:02:00 1496 Downloading https://make.opendata.ch/wiki/project:the_twitter_parliament The Twitter Parliament 2015-09-05 14:39:00 1575 Downloading https://make.opendata.ch/wiki/project:voting_preferences_by_income Voting Preferences 2015-09-06 09:30:00 295 Downloading https://make.opendata.ch/wiki/project:was_waere_wenn_wahlen Was wäre wenn Wahlen 
2015-09-08 11:19:00 993 Retrieving event: Open Research Hackdays / https://make.opendata.ch/wiki/event:2015-06 Downloading http://make.opendata.ch/wiki/project:openfooddna Open Food DNA 2015-06-06 16:30:00 3944 Downloading http://make.opendata.ch/wiki/project:opensnp openSNP (open genetics data) 2015-06-06 18:31:00 4702 Downloading http://make.opendata.ch/wiki/project:genestobiodiversity Genes to Biodiversity (Plazi) 2015-06-25 06:55:00 2552 Downloading http://make.opendata.ch/wiki/project:scatterwindrose Scattered Windrose 2017-07-10 18:23:00 952 Downloading http://make.opendata.ch/wiki/project:analyzecordis Analyze Cordis data (EU-funded research projects) 2015-06-09 16:29:00 3032 Downloading http://make.opendata.ch/wiki/project:swissprtr SwissPRTR dataset (Pollutant Release Register) 2015-06-05 16:23:00 695 Downloading http://make.opendata.ch/wiki/project:parliament_impact Parliament Impact Project 2015-06-08 07:43:00 1394 Downloading http://make.opendata.ch/wiki/project:discoverabilitythroughstructure Discoverability through Structure (Open Data Multisearch) 2019-02-03 21:17:00 4479 Downloading http://make.opendata.ch/wiki/project:tumblrverse tumblrverse 2018-11-14 06:38:00 1093 Retrieving event: Open Cultural Data Hackathon / https://make.opendata.ch/wiki/event:2015-02 Downloading https://make.opendata.ch/wiki/project:alain_und_laura_sind_neue_medien Alain und Laura sind neue Medien 2015-03-13 22:51:00 729 Downloading https://make.opendata.ch/wiki/project:ancestors Ancestors on Wikidata 2015-03-12 12:46:00 305 Downloading https://make.opendata.ch/wiki/project:artmap Artmap 2015-03-12 12:45:00 327 Downloading https://make.opendata.ch/wiki/project:catexport catexport 2015-03-12 12:55:00 388 Downloading https://make.opendata.ch/wiki/project:cultural_radio Cultural Music Radio 2016-07-04 13:14:00 1421 Downloading https://make.opendata.ch/wiki/project:diplomatic_documents_and_swiss_newspapers_in_1914 Diplomatic Documents and Swiss Newspapers in 1914 2015-03-01 15:46:00 2748 Downloading https://make.opendata.ch/wiki/project:flying_eduardo flying Eduardo 2015-06-06 10:09:00 768 Downloading https://make.opendata.ch/wiki/project:graphing_the_stateless Graphing the Stateless People in Carl Durheim's Photos 2015-03-12 13:02:00 1577 Downloading https://make.opendata.ch/wiki/project:historical_card_game Historical Tarot Freecell 2017-09-15 22:24:00 3073 Downloading https://make.opendata.ch/wiki/project:historical_views_of_zurich_data_upload Historical Views of Zurich Data Upload 2015-03-02 11:18:00 591 Downloading https://make.opendata.ch/wiki/project:lausanne_photo_quiz Lausanne Historic GeoGuesser 2015-03-01 14:45:00 380 Downloading https://make.opendata.ch/wiki/project:oldmaps Oldmaps online 2019-02-03 21:17:00 3127 Downloading https://make.opendata.ch/wiki/project:openglam_inventory OpenGLAM Inventory 2015-02-28 14:35:00 2155 Downloading https://make.opendata.ch/wiki/project:picturethis Picture This 2018-04-24 10:55:00 3754 Downloading https://make.opendata.ch/wiki/project:portrait_domain Portrait Domain 2016-07-27 10:48:00 1141 Downloading https://make.opendata.ch/wiki/project:publicdomaingame Public Domain Game 2015-05-05 08:38:00 380 Downloading https://make.opendata.ch/wiki/project:schweizer_kleinmeister:an_unexpected_journey Schweizer Kleinmeister: An Unexpected Journey 2015-05-29 21:23:00 2422 Downloading https://make.opendata.ch/wiki/project:spock_monroe_art_brut Spock Monroe Art Brut 2015-03-12 12:26:00 662 Downloading https://make.opendata.ch/wiki/project:swissgamesshowcase Swiss 
Games Showcase 2017-09-16 17:13:00 535 Downloading https://make.opendata.ch/wiki/project:the-endless-story The Endless Story 2018-11-13 14:59:00 341 Downloading https://make.opendata.ch/wiki/project:thematizer Thematizer 2018-11-24 11:48:00 3205 Downloading https://make.opendata.ch/wiki/project:monument_lists WikiProject "Cultural heritage" 2017-06-12 11:34:00 1442 Downloading https://make.opendata.ch/wiki/project:viisoo ViiSoo 2015-03-12 12:15:00 1007 Downloading https://make.opendata.ch/wiki/project:zuerich_1799 Zürich 1799 2015-03-12 12:20:00 452 Retrieving event: International Sports Hackdays / http://make.opendata.ch/wiki/event:2014-5 Downloading https://make.opendata.ch/wiki/project:blitzpoll Blitzpoll 2014-05-26 11:31:00 1798 Downloading https://make.opendata.ch/wiki/project:secondlamp SecondLamp 2014-05-24 16:47:00 876 Downloading https://make.opendata.ch/wiki/project:spoertle Spörtle 2014-05-25 21:33:00 1485 Downloading https://make.opendata.ch/wiki/project:matchquote MatchQuote 2014-05-25 21:39:00 2131 Downloading https://make.opendata.ch/wiki/project:tour_de_france Tour de France infoviz 2014-05-28 13:25:00 1574 Downloading https://make.opendata.ch/wiki/project:tour_de_france_history Tour de France API 2014-05-24 11:23:00 2381 Downloading https://make.opendata.ch/wiki/project:optisports Optisports 2014-05-24 14:57:00 1085 Downloading https://make.opendata.ch/wiki/project:linkedbisses Linked Open Data for Bisses 2014-05-24 15:43:00 3929 Downloading https://make.opendata.ch/wiki/project:beatit Beat It 2014-05-24 17:39:00 1457 Downloading https://make.opendata.ch/wiki/project:sportee Sportee 2014-05-24 16:34:00 1875 Downloading https://make.opendata.ch/wiki/project:tor_de_geants Tor de Geants visualization 2014-05-26 13:50:00 305 Downloading https://make.opendata.ch/wiki/project:digital_trainer_prototype Digital Trainer Prototype 2018-11-14 06:17:00 842 Retrieving event: "Zurich Open Data" Hacknights / https://make.opendata.ch/wiki/event:2013-10 Downloading http://make.opendata.ch/wiki/project:zwuermli Zwürmli 2013-11-14 09:52:00 2194 Downloading http://make.opendata.ch/wiki/project:pollenapp Kein Stress mit den Pollen 2013-11-13 18:07:00 1551 Downloading http://make.opendata.ch/wiki/project:biz Brancheninformationen Zürich 2013-11-13 00:37:00 3080 Downloading http://make.opendata.ch/wiki/project:familyfriendlyzurich Familienfreundliches Zürich 2013-10-30 19:53:00 1137 Downloading http://make.opendata.ch/wiki/project:crowdpee Crowdpee 2013-11-23 15:11:00 2370 Downloading http://make.opendata.ch/wiki/project:denkmalfuehrer Denkmalführer 2013-11-18 16:43:00 5435 Downloading http://make.opendata.ch/wiki/project:bevoelkerung Bevölkerungsbestände Stadt Zürich 2013-11-12 10:33:00 334 Downloading http://make.opendata.ch/wiki/project:zurichforlife Züri für das Leben 2013-11-17 11:04:00 1486 Downloading http://make.opendata.ch/wiki/project:otpzurich OpenTripPlanner for Zurich 2013-10-15 21:33:00 692 Downloading http://make.opendata.ch/wiki/project:zuerichmoodindex Zürich Mood Index 2013-11-12 00:32:00 370 Downloading http://make.opendata.ch/wiki/project:vonundzuzuerich Von und Zu Zürich 2013-11-12 00:30:00 379 Downloading http://make.opendata.ch/wiki/project:wasteland Wasteland 2018-01-27 21:10:00 408 Downloading http://make.opendata.ch/wiki/project:zuericrime ZüriCrime 2013-11-12 00:32:00 368 Downloading http://make.opendata.ch/wiki/project:induo Induo 2013-11-12 00:31:00 361 Downloading http://make.opendata.ch/wiki/project:einkommensklassen_der_gemeinden_des_kantons_zh Einkommensklassen 
Gemeinden Kt. ZH 2013-11-03 11:50:00 498 Retrieving event: Law Mining Hackathon at OKCon / https://make.opendata.ch/wiki/event:2013-09 Downloading https://make.opendata.ch/wiki/project:open_law_search Open Law Search 2013-09-21 18:55:00 1247 Downloading https://make.opendata.ch/wiki/project:legal:semantic_legal Semantic Legal 2013-09-19 14:07:00 2779 Downloading https://make.opendata.ch/wiki/project:legal:swiss_open_government_licence Swiss Open Government Licence 2013-09-19 16:42:00 1578 Downloading https://make.opendata.ch/wiki/project:legal:productliability Linked Product Liability Data 2018-11-13 15:34:00 1819 Downloading https://make.opendata.ch/wiki/project:legal:casesuccess Case Success 2018-11-13 14:47:00 2549 Downloading https://make.opendata.ch/wiki/project:legal:swiss_supremecourt Swiss Supreme Court 2013-09-25 08:22:00 1343 Downloading https://make.opendata.ch/wiki/project:legal:swiss_courts Swiss Courts 2013-09-12 00:33:00 1477 Downloading https://make.opendata.ch/wiki/project:legal:derivativeworks Derivative Works 2018-11-13 15:30:00 1405 Downloading https://make.opendata.ch/wiki/project:legal:openprivacypolicy Open Privacy Policy 2013-09-30 10:04:00 2351 Downloading https://make.opendata.ch/wiki/project:legal:opendatabutton Open This Data! 2018-11-13 15:30:00 5115 Downloading https://make.opendata.ch/wiki/project:legal:masstortplatform Mass Tort Litigation 2019-02-03 21:16:00 1636 Retrieving event: Open Finance Data / https://make.opendata.ch/wiki/event:2013-03 Downloading https://make.opendata.ch/wiki/project:taxfreedom Tax Freedom Day 2018-11-13 14:55:00 2011 Downloading https://make.opendata.ch/wiki/project:qualified_money Qualified Money 2018-11-13 14:39:00 3851 Downloading https://make.opendata.ch/wiki/project:openaid Open Aid 2017-09-25 11:39:00 1764 Downloading https://make.opendata.ch/wiki/project:open_budget Open Budget 2017-01-18 17:49:00 1140 Downloading https://make.opendata.ch/wiki/project:finanzausgleich_bern Finanzausgleich Kanton Bern 2013-08-15 23:23:00 1252 Downloading https://make.opendata.ch/wiki/project:ftth_map FTTH Business Model Navigator ? 167 Downloading https://make.opendata.ch/wiki/project:cumulizer Cumulizer 2019-02-03 21:16:00 6613 Downloading https://make.opendata.ch/wiki/project:cardgame Mobile Deck Card Game with Financial Data 2018-11-13 14:56:00 921 Downloading https://make.opendata.ch/wiki/project:semantic_legal Semantic Legal 2013-09-15 20:42:00 32 Downloading http://make.opendata.ch/wiki/project:cardgame Gemeinde-Quartett 2018-11-13 14:56:00 921 Downloading http://make.opendata.ch/wiki/project:taxfreedom Vis 4 Tax Freedom Day 2018-11-13 14:55:00 2011 Downloading http://make.opendata.ch/wiki/project:ftth_map FTTH Business Model Navigator ? 167 Downloading http://make.opendata.ch/wiki/project:opendeza http://make.opendata.ch/wiki/project:opendeza ? 167 Downloading https://make.opendata.ch/wiki/project:lotd Link the Swiss Open Tourism Data to the World Linked Open Data 2013-08-15 23:02:00 3873 Downloading https://make.opendata.ch/wiki/project:partisbudgets Partis Budgets 2013-08-15 22:58:00 1875 Downloading https://make.opendata.ch/wiki/project:crowdtagging Crowd Interest 2018-10-11 12:19:00 1577 Downloading https://make.opendata.ch/wiki/project:smartski Partons en piste ! 2013-08-15 22:59:00 2470 Downloading https://make.opendata.ch/wiki/project:holiday_apartments Holiday Apartments 2018-11-13 15:03:00 830 Downloading https://make.opendata.ch/wiki/project:calcul_du_pouvoir_d_achat_selon_le_salaire_et_le_cout_de_la_vie Mon pouvoir d'achat ? 
167 Downloading https://make.opendata.ch/wiki/project:electronomy Electronomy 2013-08-15 22:23:00 6365 Retrieving event: Open Health Data / https://make.opendata.ch/wiki/event:2012-09 Downloading https://make.opendata.ch/wiki/project:born-died-ch born/died//ch 2013-08-15 22:52:00 1108 Downloading https://make.opendata.ch/wiki/project:compare-hospitals Compare Hospitals 2015-08-11 10:13:00 1735 Downloading https://make.opendata.ch/wiki/project:emergency_or_not Emergency or Not 2013-08-15 22:45:00 2618 Downloading https://make.opendata.ch/wiki/project:health_data_visualization Hospital Data Visualization 2013-08-15 22:42:00 984 Downloading https://make.opendata.ch/wiki/project:instacare InstaCare 2019-02-03 21:14:00 1722 Downloading https://make.opendata.ch/wiki/project:health:ipollution iPollution 2018-09-28 09:39:00 1863 Downloading https://make.opendata.ch/wiki/project:medicaltimeline Medical Timeline 2013-08-15 22:55:00 2775 Downloading https://make.opendata.ch/wiki/project:health:mygenerics myGenerics 2019-02-03 21:17:00 1170 Downloading https://make.opendata.ch/wiki/project:openmedsensor Open Med Sensor 2013-08-15 23:07:00 2927 Retrieving event: Open Data Camp / https://make.opendata.ch/wiki/event:2012-04 Downloading https://make.opendata.ch/wiki/project:bern_budget Bern Budget 2012 datavis 2017-09-25 11:32:00 443 Downloading https://make.opendata.ch/wiki/project:mysquartier Mys Quartier 2013-08-15 23:06:00 3278 Downloading https://make.opendata.ch/wiki/project:smartvote Smartvote Kandidaten 2012-04-30 00:00:00 262 Downloading https://make.opendata.ch/wiki/project:canihazswim Wassertemperaturen von Flussen 2012-05-07 00:00:00 493 Downloading https://make.opendata.ch/wiki/project:protectwildlife Wildruhezonen 2013-08-15 23:05:00 2008 Downloading https://make.opendata.ch/wiki/project:bake_open_data Bake Open Data 2018-11-13 15:03:00 1226 Downloading https://make.opendata.ch/wiki/project:wear_open_data Wear Open Data 2018-11-13 14:42:00 1502 Retrieving event: Space Open Data / https://make.opendata.ch/wiki/event:2012-04#lausanne Downloading https://make.opendata.ch/wiki/project:bern_budget Bern Budget 2012 datavis 2017-09-25 11:32:00 443 Downloading https://make.opendata.ch/wiki/project:mysquartier Mys Quartier 2013-08-15 23:06:00 3278 Downloading https://make.opendata.ch/wiki/project:smartvote Smartvote Kandidaten 2012-04-30 00:00:00 262 Downloading https://make.opendata.ch/wiki/project:canihazswim Wassertemperaturen von Flussen 2012-05-07 00:00:00 493 Downloading https://make.opendata.ch/wiki/project:protectwildlife Wildruhezonen 2013-08-15 23:05:00 2008 Downloading https://make.opendata.ch/wiki/project:bake_open_data Bake Open Data 2018-11-13 15:03:00 1226 Downloading https://make.opendata.ch/wiki/project:wear_open_data Wear Open Data 2018-11-13 14:42:00 1502 Retrieving event: Open Transport Data Camp / https://make.opendata.ch/wiki/event:2012-03 Downloading https://make.opendata.ch/wiki/project:transport:transportflows Transport Flows 2018-09-28 09:14:00 2079 Downloading https://make.opendata.ch/wiki/project:transport:tangiblestatistics Tangible Statistics 2013-08-15 23:15:00 1852 Downloading https://make.opendata.ch/wiki/project:transport:trainsharingapp TrainSharingApp 2018-10-11 11:27:00 1167 Downloading https://make.opendata.ch/wiki/project:transport:gottago GottaGo 2013-08-15 22:37:00 2078 Downloading https://make.opendata.ch/wiki/project:transport:swisschronograph Swisschronograph 2013-08-15 23:18:00 293 Downloading https://make.opendata.ch/wiki/project:data2open Data2Open 
2013-08-15 23:14:00 3247 Downloading https://make.opendata.ch/wiki/project:geneve-velo Vélo Genève 2013-08-15 23:16:00 651 Downloading https://make.opendata.ch/wiki/project:pmropenfixmap PMR-openfixmap 2019-02-03 21:16:00 3435 Downloading https://make.opendata.ch/wiki/project:mobility:gsm-towers Where's My Tower 2018-11-14 06:11:00 1804 Downloading https://make.opendata.ch/wiki/project:sieste Sieste 2019-02-03 21:16:00 1149 Downloading https://make.opendata.ch/wiki/project:sitg2osm SITG 2 OSM 2013-08-15 23:15:00 618 Downloading https://make.opendata.ch/wiki/project:jsignage-transport Public Transport jSignage 2013-08-15 23:14:00 732 Downloading https://make.opendata.ch/wiki/project:transport_api Transport API 2018-11-14 06:06:00 2072 Retrieving event: Open Data Camp / https://make.opendata.ch/wiki/event:2011-09 Downloading https://make.opendata.ch/wiki/project:swiss_army_contaminated_sites swiss_army_contaminated_sites 2013-08-15 23:23:00 1457 Downloading https://make.opendata.ch/wiki/project:makeopendata makeopendata 2018-11-13 15:31:00 917 Downloading https://make.opendata.ch/wiki/project:openpolitics openpolitics 2013-08-15 23:13:00 1513 Downloading https://make.opendata.ch/wiki/project:openletten openletten 2017-09-22 18:37:00 4392 Downloading https://make.opendata.ch/wiki/project:student_migration student_migration 2013-08-15 23:12:00 789 Downloading https://make.opendata.ch/wiki/project:wheredidmytaxesgo wheredidmytaxesgo 2018-11-13 14:55:00 767 Downloading https://make.opendata.ch/wiki/project:swiss_open_government_licence swiss_open_government_licence 2013-09-15 20:41:00 47 Downloading https://make.opendata.ch/wiki/project:mobility mobility 2017-09-25 11:48:00 2562 Downloading https://make.opendata.ch/wiki/project:consommation_energie_lausanne consommation_energie_lausanne 2013-11-28 16:01:00 1506 Downloading https://make.opendata.ch/wiki/project:criminalite_et_sentiment_d_insecurite criminalite_et_sentiment_d_insecurite 2013-08-15 23:10:00 919 Downloading https://make.opendata.ch/wiki/project:swiss_associations_directory swiss_associations_directory 2013-08-15 23:08:00 1200 Downloading https://make.opendata.ch/wiki/project:swissmap swissmap 2018-01-11 17:23:00 652 Downloading https://make.opendata.ch/wiki/project:parlament parlament 2016-08-23 14:10:00 1447 Downloading https://make.opendata.ch/wiki/project:green_street green_street 2013-08-15 23:06:00 1445 Downloading https://make.opendata.ch/wiki/project:health_on_duty health_on_duty 2013-08-15 23:07:00 959 Downloading https://make.opendata.ch/wiki/project:ch_euro_geigermaps ch_euro_geigermaps 2013-08-15 23:06:00 2697
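The run above follows a simple pattern: refresh the event index, retrieve each event page, then download every linked project page and record its title, last-updated timestamp and size. A minimal sketch of that flow in Python, assuming requests, BeautifulSoup and morph.io's conventional data.sqlite output (the actual scraper may be organized differently):

    import sqlite3
    import requests
    from bs4 import BeautifulSoup

    BASE = "https://make.opendata.ch"

    def fetch(url):
        # Download a wiki page and return the parsed HTML.
        resp = requests.get(url)
        resp.raise_for_status()
        return BeautifulSoup(resp.text, "html.parser")

    def scrape_event(event_url, db):
        # Collect every project page linked from one event page.
        page = fetch(event_url)
        for link in page.select('a[href*="project:"]'):
            project_url = BASE + link["href"]
            project = fetch(project_url)
            title = project.find(["h1", "h2"]).get_text(strip=True)
            text = project.get_text(" ", strip=True)
            print("Downloading", project_url, title, len(text))
            db.execute(
                "INSERT OR REPLACE INTO data (event_url, url, title, text) VALUES (?, ?, ?, ?)",
                (event_url, project_url, title, text),
            )

    if __name__ == "__main__":
        con = sqlite3.connect("data.sqlite")  # morph.io convention
        con.execute("CREATE TABLE IF NOT EXISTS data "
                    "(event_url TEXT, url TEXT PRIMARY KEY, title TEXT, text TEXT)")
        home = fetch(BASE + "/wiki/event:home")
        for link in home.select('a[href*="event:"]'):
            scrape_event(BASE + link["href"], con)
        con.commit()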

Data

Downloaded 4 times by loleg


The data can be downloaded as a CSV table or as an SQLite database (1.8 MB), or queried via the API.
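A quick way to work with the SQLite download, assuming morph.io's default layout of a data.sqlite file containing a single data table with the columns listed below:

    import sqlite3

    con = sqlite3.connect("data.sqlite")
    con.row_factory = sqlite3.Row

    # Show the ten most recently updated project pages.
    for row in con.execute(
        "SELECT event, title, updated, url FROM data ORDER BY updated DESC LIMIT 10"
    ):
        print(row["updated"], "|", row["event"], "|", row["title"], "|", row["url"])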

Showing 10 of 227 rows.

Columns: event, event_url, updated, title, url, text, html
event: GLAM mix'n'hack 2019
updated: 2020-02-06 14:44:00
title: Back to the Greek Universe
text:

Back to the Greek Universe

Back to the Greek Universe is a web application that allows users to explore the ancient Greek model of the universe in virtual reality, so that they can realize what detailed knowledge the Greeks had of the movement of the celestial bodies observable from the Earth's surface. The model is based on Claudius Ptolemy's work, which is characterized by the fact that it adopts a geocentric view of the universe, with the Earth in the center.

Ptolemy placed the planets in the following order: Moon, Mercury, Venus, Sun, Mars, Jupiter, Saturn, fixed stars.

The movements of the celestial bodies as they appear to earthlings are expressed as a series of superposed circular movements (see deferent and epicycle theory), characterized by varying radius and speed. The tabular values that serve as inputs to the model have been extracted from the literature.

Demo Video

Claudius Ptolemy (~100-160 AD) was a Greek scientist working at the library of Alexandria. One of his most important works, the «Almagest», sums up the geographic, mathematical and astronomical knowledge of the time. It is the first outline of a coherent system of the universe in the history of mankind.

Back to the Greek Universe is a VR model that rebuilds Ptolemy's system of the universe on a scale of 1:1 billion. The planets are 100 times larger, the Earth rotates 100 times more slowly, and the planets' orbital periods are 1 million times faster than they would be according to Ptolemy's calculations.

Back to the Greek Universe was coded and presented at the Swiss Open Cultural Data Hackathon / mix'n'hack 2019 in Sion, Switzerland, from September 6-8, 2019, by Thomas Weibel, Cédric Sievi, Pia Viviani and Beat Estermann.

Instructions

This is how to fly Ptolemy's virtual spaceship:

Point your smartphone camera towards the QR code and tap on the popup banner in order to launch into space.
Turn around and discover the ancient Greek solar system. Follow the planets' epicyclic movements (see above).
Tap in order to travel through space, in any direction you like. Every single tap will teleport you roughly 18 million miles forward.
Back home: Point your device vertically down and tap in order to teleport back to Earth.
Gods' view: Point your device vertically up and tap in order to overlook Ptolemy's system of the universe from high above.

The cockpit on top is a time and distance display: the years and months indicator gives you an idea of how rapidly time goes by in the simulation, and the miles indicator always displays your current distance from the Earth's center (in million nautical miles).

Data

The data used include 16th-century prints of Ptolemy's main work, the Almagest (both in Greek and Latin), and high-resolution surface photos of the planets in Mercator projection. The photos are mapped onto rotating spheres by means of Mozilla's web VR framework A-Frame.

Earth map (public domain)
Moon map (public domain)
Mercury map (public domain)
Venus map (public domain)
Sun map (public domain)
Mars map (public domain)
Jupiter map (public domain)
Saturn map (public domain)
Stars map (Milky Way) (Creative Commons Attribution 4.0 International)

Primary literature

Simon Grynaeus: Kl. Ptolemaiou Megalēs syntaxeōs bibl. 13, public domain
Peter Liechtenstein: Almagestum CL. Ptolemei Pheludiensis Alexandrini astronomorum principis opus ingens ac nobile omnes celoru motus continens, public domain

Secondary literature

Richard Fitzpatrick: A Modern Almagest, An Updated Version of Ptolemy's Model of the Solar System
John Cramer: The Ptolemaic System, A Detailed Synopsis
Astrophysikalisches Institut Neunhof: Das Weltmodell des Ptolemaios

Version history

2019/09/07 v1.0: Basic VR engine, interactive prototype
2019/09/08 v1.01: Cockpit with time and distance indicator
2019/09/13 v1.02: Space flight limited to stars sphere, minor bugfixes
2019/09/17 v1.03: Planet ecliptics adjusted

Media

Back to the Greek Universe video (mp4), public domain

Team

Thomas Weibel (weibelth)
Cédric Sievi
Pia Viviani (pia)
Beat Estermann (beat_estermann)

Tags: concept, dev, design, glam
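The "superposed circular movements" mentioned above reduce to a deferent plus an epicycle: each planet rides a small circle whose centre itself moves on a larger circle around the Earth. A short Python sketch of that geometry; the radii, periods and sampling below are placeholder values for illustration, not the tabular inputs the project extracted from the literature:

    import math

    def ptolemaic_position(t, deferent_radius, deferent_period,
                           epicycle_radius, epicycle_period):
        """Geocentric (x, y) of a body on a deferent plus epicycle at time t (days)."""
        # Centre of the epicycle travels on the deferent around the Earth.
        a = 2 * math.pi * t / deferent_period
        cx, cy = deferent_radius * math.cos(a), deferent_radius * math.sin(a)
        # The body itself travels on the epicycle around that moving centre.
        b = 2 * math.pi * t / epicycle_period
        return cx + epicycle_radius * math.cos(b), cy + epicycle_radius * math.sin(b)

    # Placeholder Mars-like parameters, sampled once per real-time minute with the
    # simulation's "1 million times faster" speed-up applied.
    SPEEDUP = 1_000_000
    for step in range(4):
        t_days = step * 60 / 86400 * SPEEDUP
        print(step, ptolemaic_position(t_days, 5.0, 687.0, 1.7, 365.25))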
event: GLAM mix'n'hack 2019
updated: 2019-09-08 15:17:00
title: CoViMAS
text:

CoViMAS

Collaborative Virtual Museum for All Senses (CoViMAS) is an extended virtual museum which engages all senses of its visitors. It is a substantial upgrade and expansion of our award-winning GLAMhack 2018 project "Walking around the Globe" (http://make.opendata.ch/wiki/project:virtual_3d_exhibition), in which the DBIS Group from the University of Basel teamed up with the ETH Library to introduce a prototype of an exhibition in Virtual Reality.

CoViMAS aims to provide a collaborative environment for multiple visitors in the virtual museum. This feature allows them to have a shared experience through different virtual reality devices.

Additionally, CoViMAS enriches the user experience in virtual space by providing physical objects which can be manipulated by the user in virtual space. Thanks to the mix'n'hack organizers and the FabLab (https://fablab-sion.ch/), the user is able to touch postcards, view them closely, and feel their texture.

To add a modern touch to the older pictures in the provided data, we add colorized images alongside the existing ones, to give a more lively look into the past using the pictures in the Virtual Museum.

Project Timeline

Day One

CoViMAS joins forces from different disciplines to form a group which contains a maker, a content provider, developer(s), a communicator, a designer and a user experience expert. Having different backgrounds and expertise was a great opportunity to explore different ideas and to broaden the horizons of the project.

Two vital components of this project are the Virtual Reality headsets and the datasets to be used. The HTC Vive Pro VR headsets were converted to wireless mode after our last experiment, which showed that freedom of movement without wires attached to the user increases the user experience and usability.

Our team's content provider and designer spent a considerable amount of time searching for representative postcards and audio which can be integrated in the virtual space and have the potential to improve the virtual reality experience by adding extra senses. This includes selecting postcards which can be touched and seen in virtual and non-virtual reality. Additionally, to improve the experience, the idea came up of hearing a sound related to the picture being viewed. This audio should have a correlation with the picture being viewed and recreate the sensation of the picture's environment for the user in the virtual world.

To integrate modern methods of image manipulation through artificial intelligence, we tried using a deep learning method to colorize the gray-scale images of the "Fotografien aus dem Wallis von Charles Rieder" dataset. The colorized images allow the visitor to get a better feel for the pictures he or she is viewing. The initial implementation of the algorithm showed the challenges we face: for example, the faded or scratched parts of the pictures could not be colorized very well.

Day Two

Although the VR exhibition is taken from our previous participation in GLAMhack 2018, the exhibition needed to be adjusted to the new content. We have designed the rooms to showcase the dataset "Postkarten aus dem Wallis (1890-1950)". At this point, the postcards selected to be enriched with additional senses were sent to the FabLab to create a haptic card and a feather pallet to be used alongside one postcard which represents a goose.

The fabricated elements of our exhibition are attached to a tracker which can be seen through the VR glasses; this allows the user to be aware of the location of the object and to sense it.

The colorization improved through the day, through some alterations in the training setup and the parameters used to tune the images. The results at this stage are relatively good. The VR exhibition hall was also adjusted to automatically load images from the postcards, together with the colorized images alongside their original versions.

And late at night, when finalizing the work for the next day, most of our stickers changed status from the "Implementation" phase to the "Done" phase!

Day Three

CoViMAS reached the final stage on the last day. The room design is finished and the locations of the images on the walls are determined. The tracker location is updated in the VR environment to represent the real location of the object. With this improvement the postcard can be touched while being viewed simultaneously.

Data

Fotografien aus dem Wallis von Charles Rieder: https://opendata.swiss/dataset/photographs-of-valais-by-charles-rieder
Postkarten aus dem Wallis (1890-1950): https://opendata.swiss/dataset/postcards-from-valais-1890-1950

Team

Mahnaz Amiri Parian, PhD Student @ Databases and Information Systems Group
Silvan Heller, PhD Student @ Databases and Information Systems Group
Florian Spiess, MSc @ Computer Science Uni Basel
Fabian
Stef
Florence

Tags: concept, dev, design, data, expert, glam
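A small sketch of how the "colorized alongside original" panels for the exhibition walls could be assembled, assuming the colorized files already exist next to the originals with a _color suffix (the file naming and folder are illustrative assumptions, not the project's actual pipeline):

    from pathlib import Path
    from PIL import Image

    def side_by_side(original_path: Path) -> Image.Image:
        # Paste the original photo and its colorized counterpart onto one canvas.
        original = Image.open(original_path).convert("RGB")
        colorized_path = original_path.with_name(
            original_path.stem + "_color" + original_path.suffix
        )
        colorized = Image.open(colorized_path).convert("RGB").resize(original.size)
        panel = Image.new("RGB", (original.width * 2, original.height))
        panel.paste(original, (0, 0))
        panel.paste(colorized, (original.width, 0))
        return panel

    if __name__ == "__main__":
        for photo in Path("postcards").glob("*.jpg"):
            if "_color" not in photo.stem and "_pair" not in photo.stem:
                side_by_side(photo).save(photo.with_name(photo.stem + "_pair.jpg"))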
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="covimas">CoViMAS</h2> <div class="level2"> <p> <video class="mediaright" controls="controls" height="240" title=" " width="320"> <source src="/wiki/_media/project:covimas.mp4" type="video/mp4"/> <a class="media mediafile mf_mp4" href="/wiki/_media/project:covimas.mp4?cache=" title="project:covimas.mp4 (133.3MB)"> </a></video> Collaborative Virtual Museum for All Senses (CoViMAS) is an extended virtual museum which engages all senses of visitors. It is a substantial upgrade and expansion of our award-winning Glamhack 2018 project Walking around the Globe <a class="urlextern" href="http://make.opendata.ch/wiki/project:virtual_3d_exhibition" rel="nofollow" title="http://make.opendata.ch/wiki/project:virtual_3d_exhibition">http://make.opendata.ch/wiki/project:virtual_3d_exhibition</a> which had the DBIS Group from the University of Basel team up with the ETH Library to introduce a prototype of an exhibition in Virtual Reality. </p> <p> CoViMAS aims to provide a collaborative environment for multiple visitors in the virtual museum. This feature allows them to have a shared experience through different virtual reality devices. </p> <p> Additionally, CoViMAS enriches the user experience in virtual space by providing physical objects which can be manipulated by the user in virtual space. Thanks to the mix'n'hack organizers and FabLab (<a class="urlextern" href="https://fablab-sion.ch/" rel="nofollow" title="https://fablab-sion.ch/">https://fablab-sion.ch/</a>), the user will be able to touch postcards, view them closely, and feel their texture. </p> <p> To add the modern touch to the older pictures in the provided data, we add colorized images alongside the existing ones, to have a more lively look into the past using the pictures in the Virtual Museum. </p> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="project_timeline">Project Timeline</h2> <div class="level2"> </div> <h3 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="day_one">Day One</h3> <div class="level3"> <p> CoViMAS joins forces of different disciplines to form a group which contains Maker, content provider, developer(s), communicator, designer and user experience expert. Having different backgrounds and expertise made a great opportunity to explore different ideas and opportunities to develop the horizons of the project. </p> <p> Two vital components of this project is Virtual Reality headset and Datasets which are going to be used. HTC Vive Pro VR headsets are converted to wireless mode after our last experiment which prove the freedom of movement without wires attached to the user, increases the user experience and feasibility of usage. </p> <p> Our team content provider and designer spent invaluable amount of time to search for the representative postcards and audio which can be integrated in the virtual space and have the potential to improve the virtual reality experience by adding extra senses. This includes selecting postcards which can be touched and seen in virtual and non-virtual reality. Additionally, to improve the experience, and idea of hearing a sound which is related to the picture being viewed popped up. This audio should have a correlation with the picture being viewed and recreate the sensation of the picture environment for the user in virtual world. 
</p> <p> To integrate the modern methods of Image manipulation through artificial Intelligence, we tried using Deep Learning method to colorize the gray-scale images of the otografien aus dem Wallis von Charles Rieder. The colored images allow the visitor get a more sensible feeling of the pictures he/she is viewing. The initial implementation of the algorithm showed the challenges we face, for example the faded parts of the pictures or scratched images could not very well be colorized. </p> <p> <a class="media" href="/wiki/_detail/project:img_20190908_112033_1_.jpg?id=project%3Acovimas" title="project:img_20190908_112033_1_.jpg"><img alt="img_20190908_112033_1_.jpg" class="media img-responsive" src="/wiki/_media/project:img_20190908_112033_1_.jpg?w=500&tok=c0350b" title="img_20190908_112033_1_.jpg" width="500"/></a> <a class="media" href="/wiki/_detail/project:test.png?id=project%3Acovimas" title="project:test.png"><img alt="" class="medialeft img-responsive" src="/wiki/_media/project:test.png?w=376&tok=fb53d7" width="376"/></a> </p> </div> <h3 class="sectionedit4 page-header pb-3 mb-4 mt-5" id="day_two">Day Two</h3> <div class="level3"> <p> Although the VR exhibition is taken from our previous participation in Glamhack2018, the exhibition needed to be adjusted to the new content. We have designed the rooms to showcase the dataset Postkarten aus dem Wallis (1890-1950). at this point, the selected postcards to be enriched with additional senses are sent to the Fablab, to create a haptic card and a feather pallet which is needed to be used alongside one postcard which represent a goose. </p> <p> <a class="media" href="/wiki/_detail/project:img_20190907_174307.jpg?id=project%3Acovimas" title="project:img_20190907_174307.jpg"><img alt="" class="medialeft img-responsive" src="/wiki/_media/project:img_20190907_174307.jpg?w=280&tok=071b99" width="280"/></a> <a class="media" href="/wiki/_detail/project:img_20190907_174005.jpg?id=project%3Acovimas" title="project:img_20190907_174005.jpg"><img alt="" class="media img-responsive" src="/wiki/_media/project:img_20190907_174005.jpg?w=500&tok=8378a3" width="500"/></a> </p> <p> the fabricated elements of our exhibition get attached to a tracker which can be seen through VR glasses and this allows the user to be aware of location of the object, and sense it. </p> <p> The Colorization improved through the day, by some alteration in training setup and the parameters used to tune the images. The results at this stage are relatively good. </p> <p> <a class="media" href="/wiki/_detail/project:111.png?id=project%3Acovimas" title="project:111.png"><img alt="" class="media img-responsive" src="/wiki/_media/project:111.png?w=300&tok=9b2747" width="300"/></a><a class="media" href="/wiki/_detail/project:2.png?id=project%3Acovimas" title="project:2.png"><img alt="" class="media img-responsive" src="/wiki/_media/project:2.png?w=300&tok=356cb7" width="300"/></a><a class="media" href="/wiki/_detail/project:055ph-00073.jpg_out.png?id=project%3Acovimas" title="project:055ph-00073.jpg_out.png"><img alt="" class="media img-responsive" src="/wiki/_media/project:055ph-00073.jpg_out.png?w=300&tok=66155f" width="300"/></a> </p> <p> And the VR exhibition hall is adjusted to be automatically load images from postcards and also the colorized images alongside their original color. </p> <p> And late night, when finalizing the works for the next day, most of our stickers have changed status from Implementation phase to Done Phase! 
</p> <p> <a class="media" href="/wiki/_detail/project:img_20190907_201637.jpg?id=project%3Acovimas" title="project:img_20190907_201637.jpg"><img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:img_20190907_201637.jpg?w=500&tok=157a99" width="500"/></a> </p> </div> <h3 class="sectionedit5 page-header pb-3 mb-4 mt-5" id="day_three">Day Three</h3> <div class="level3"> <p> CoViMAS is getting to the final stage in the last day. The Room design is finished and the location of the images on the wall are determined. The tracker location is updated in the VR environment to represent the real location of the object. With this improvement the postcard can be touched while being viewed simultaneously. </p> <p> <a class="media" href="/wiki/_detail/project:ce0da5f3-16e6-411c-aee1-86aba5827ee3.jpeg?id=project%3Acovimas" title="project:ce0da5f3-16e6-411c-aee1-86aba5827ee3.jpeg"><img alt="" class="media img-responsive" src="/wiki/_media/project:ce0da5f3-16e6-411c-aee1-86aba5827ee3.jpeg?w=500&tok=003923" width="500"/></a> <a class="media" href="/wiki/_detail/project:feather.jpeg?id=project%3Acovimas" title="project:feather.jpeg"><img alt="" class="media img-responsive" src="/wiki/_media/project:feather.jpeg?w=360&tok=148270" width="360"/></a> </p> </div> <h2 class="sectionedit6 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Fotografien aus dem Wallis von Charles Rieder <a class="urlextern" href="https://opendata.swiss/dataset/photographs-of-valais-by-charles-rieder" rel="nofollow" title="https://opendata.swiss/dataset/photographs-of-valais-by-charles-rieder">https://opendata.swiss/dataset/photographs-of-valais-by-charles-rieder</a></div> </li> <li class="level1"><div class="li"> Postkarten aus dem Wallis (1890-1950) <a class="urlextern" href="https://opendata.swiss/dataset/postcards-from-valais-1890-1950" rel="nofollow" title="https://opendata.swiss/dataset/postcards-from-valais-1890-1950">https://opendata.swiss/dataset/postcards-from-valais-1890-1950</a></div> </li> </ul> </div> <h2 class="sectionedit7 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Mahnaz Amiri Parian, PhD Student @ <a class="urlextern" href="https://dbis.dmi.unibas.ch" rel="nofollow" title="https://dbis.dmi.unibas.ch">Databases and Information Systems Group</a></div> </li> <li class="level1"><div class="li"> Silvan Heller, PhD Student @ <a class="urlextern" href="https://dbis.dmi.unibas.ch" rel="nofollow" title="https://dbis.dmi.unibas.ch">Databases and Information Systems Group</a></div> </li> <li class="level1"><div class="li"> Florian Spiess, MsC @ <a class="urlextern" href="https://dmi.unibas.ch/de/forschung/informatik/" rel="nofollow" title="https://dmi.unibas.ch/de/forschung/informatik/">Computer Science Uni Basel</a></div> </li> <li class="level1"><div class="li"> Fabian </div> </li> <li class="level1"><div class="li"> Stef</div> </li> <li class="level1"><div class="li"> Florence</div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> 
dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:data?do=showtag&tag=needs%3Adata" rel="tag" title="needs:data"><span class="iconify" data-icon="mdi:tag-text-outline"></span> data</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:expert?do=showtag&tag=needs%3Aexpert" rel="tag" title="needs:expert"><span class="iconify" data-icon="mdi:tag-text-outline"></span> expert</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> glam</a> </span></div> </div>
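<p> As mentioned in the Day One and Day Two notes above, here is a minimal colorization sketch. The wiki does not document the team's actual training code, so this is only an illustrative example, assuming the publicly available pre-trained colorization network of Zhang et al. used through OpenCV's DNN module; the three model files and the image path are placeholders that have to be obtained separately. </p> <pre class="code"># Illustrative sketch (not the project's own pipeline): colorize a gray-scale
# photograph with the pre-trained Zhang et al. colorization network via OpenCV.
# The model files below are assumptions and must be downloaded separately.
import cv2
import numpy as np

net = cv2.dnn.readNetFromCaffe("colorization_deploy_v2.prototxt",
                               "colorization_release_v2.caffemodel")
pts = np.load("pts_in_hull.npy")  # cluster centres of the ab colour space

# Feed the cluster centres into the network as 1x1 convolution kernels.
class8 = net.getLayerId("class8_ab")
conv8 = net.getLayerId("conv8_313_rh")
net.getLayer(class8).blobs = [pts.transpose().reshape(2, 313, 1, 1).astype(np.float32)]
net.getLayer(conv8).blobs = [np.full([1, 313], 2.606, dtype=np.float32)]

img = cv2.imread("scan.jpg")                       # placeholder path to a scanned photograph
lab = cv2.cvtColor(img.astype(np.float32) / 255.0, cv2.COLOR_BGR2LAB)
L = cv2.resize(lab, (224, 224))[:, :, 0] - 50      # mean-centred lightness channel

net.setInput(cv2.dnn.blobFromImage(L))
ab = net.forward()[0].transpose((1, 2, 0))         # predicted ab channels (56x56x2)
ab = cv2.resize(ab, (img.shape[1], img.shape[0]))  # upscale to the original size

# Recombine the original lightness with the predicted colour channels.
colorized = cv2.cvtColor(np.dstack((lab[:, :, 0], ab)), cv2.COLOR_LAB2BGR)
cv2.imwrite("colorized.jpg", (255 * np.clip(colorized, 0, 1)).astype(np.uint8))</pre>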
GLAM mix'n'hack 2019
2019-10-31 17:38:00
Opera Forever
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="opera_forever">Opera Forever</h2> <div class="level2"> <p> <strong>Opera Forever</strong> is an online collaboration platform and social networking site to collectively explore large amounts of opera recordings. </p> <p> The platform allows users to tag audio sequences with various types of semantics, such as personal preference, emotional reaction, specific musical features, technical issues, etc. Through the analysis of personal preference and/or emotional reaction to specific audio sequences, a characterization of personal listening tastes will be possible, and people with similar (or very dissimilar) tastes can be matched. The platform will also contain a recommendation system based on preference information and/or keyword search. </p> <p> <strong>Background:</strong> The Bern University of the Arts has inherited a large collection of about 15'000 hours of bootleg live opera recordings. Most of these recordings are unique, and many individual recordings rather long (up to 3-4 hours), hence the idea of segmenting the recordings so as to allow for the creation of semantical links between segments to enhance the possibilities of collectively exploring the collection. In our fast-moving times, drawing on </p> <p> <strong>Core Idea:</strong> Users engaging in active listening leave semantic traces behind that can be used as a resource to guide further exploration of the collection, both by themselves and by third parties. The approach can be used for an entire spectrum of users, ranging from occasional opera listeners, through opera amateurs, to interpretation researchers. The tool can be used as a collaborative tagging platform among research teams or within citizen science settings. By putting the focus on the listeners and their personal reaction to the audio segments, the perspective of analysis can be switched to the user, e.g. by creating typologies or clusterings of listening tastes or by using the approach for match-making in social settings. </p> </div> <h3 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="demo_video">Demo Video</h3> <div class="level3"> <iframe allowfullscreen="" class="vshare__none" frameborder="0" height="293" scrolling="no" src="//player.vimeo.com/video/358615682" width="520">Opera Forever</iframe> </div> <h3 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="proof_of_concept">Proof of Concept</h3> <div class="level3"> <p> <a class="urlextern" href="https://opera.now.sh" rel="nofollow" title="https://opera.now.sh">Opera Forever (demo application)</a> </p> <p> A first proof of concept was developed at the Swiss Open Cultural Data Hackathon 2019 in Sion and contains the following features: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> The user can browse through and listen to the recordings of different performances of the same opera.</div> </li> <li class="level1"><div class="li"> The individual recordings are segmented into their different parts.</div> </li> <li class="level1"><div class="li"> By using simple swiping gestures, the user can navigate between the individual segments of the same recording (swiping left or right) or between different recordings (swiping up or down) - <em>the swiping is not yet implemented, but you can click on the respective arrows.</em> </div> </li> <li class="level1"><div class="li"> For each segment, the user can indicate to what extent they like that particular segment (1 to 5 stars). 
- <em>not implemented yet </em></div> </li> <li class="level1"><div class="li"> Based on this information, individual preference lists and collective hit-parades are generated. - <em>not implemented yet </em></div> </li> <li class="level1"><div class="li"> Also, it will be possible to cluster users according to their musical taste, which opens up the possibility to match users based on their musical taste or to build recommendation systems. <em>not implemented yet</em></div> </li> </ul> </div> <h2 class="sectionedit4 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Metadata: <a class="urlextern" href="https://old.datahub.io/dataset/ehrenreich-collection-database" rel="nofollow" title="https://old.datahub.io/dataset/ehrenreich-collection-database">Ehrenreich Collection Database</a></div> </li> <li class="level1"><div class="li"> Audio Files: Digitized audio recordings from the Ehrenreich Collection (currently not available online; many of them presenting copyright issues)</div> </li> <li class="level1"><div class="li"> Photographs of artists: Taken from a variety of websites; most of them presenting copyright issues.</div> </li> </ul> </div> <h3 class="sectionedit5 page-header pb-3 mb-4 mt-5" id="documentation">Documentation</h3> <div class="level3"> <p> <a class="urlextern" href="https://docs.google.com/document/d/1C1plxqo_lOGWNXj5uEcAydmx_ZePvUio-UY5XOKvBlE/edit" rel="nofollow" title="https://docs.google.com/document/d/1C1plxqo_lOGWNXj5uEcAydmx_ZePvUio-UY5XOKvBlE/edit">Google Doc with Notes</a> </p> </div> <h2 class="sectionedit6 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Birk Weiberg (<a class="wikilink2" href="/wiki/user:birk" rel="nofollow" title="user:birk">birk</a>)</div> </li> <li class="level1"><div class="li"> Dominik Sievi (<a class="wikilink2" href="/wiki/user:dsievi" rel="nofollow" title="user:dsievi">dsievi</a>)</div> </li> <li class="level1"><div class="li"> Beat Estermann (<a class="wikilink1" href="/wiki/user:beat_estermann" title="user:beat_estermann">beat_estermann</a>)</div> </li> <li class="level1"><div class="li"> Pia Viviani (<a class="wikilink2" href="/wiki/user:pia" rel="nofollow" title="user:pia">pia</a>)</div> </li> <li class="level1"><div class="li"> Oleg Lavrovsky (<a class="wikilink1" href="/wiki/user:loleg" title="user:loleg">loleg</a>)</div> </li> <li class="level1"><div class="li"> Kenny Floria (<a class="wikilink2" href="/wiki/user:paulkc" rel="nofollow" title="user:paulkc">paulkc</a>)</div> </li> </ul> <p> Contact: beat.estermann@bfh.ch </p> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> 
glam</a> </span></div> </div>
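<p> As noted in the Core Idea above, listeners could be matched by their reactions to segments. The wiki does not document how this matching would be implemented, so the following is only an illustrative sketch under assumptions of my own: per-segment star ratings, made-up segment identifiers, and a plain cosine-similarity comparison. </p> <pre class="code"># Illustrative sketch: match listeners by their star ratings of opera segments.
# The user names, segment IDs and ratings are invented, not project data.
from math import sqrt

ratings = {
    "anna":  {"tosca_act1_seg3": 5, "tosca_act2_seg1": 2, "aida_act1_seg4": 4},
    "ben":   {"tosca_act1_seg3": 4, "aida_act1_seg4": 5},
    "carla": {"tosca_act2_seg1": 5, "aida_act1_seg4": 1},
}

def similarity(a, b):
    """Cosine similarity over the segments both users have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[s] * b[s] for s in shared)
    norm = sqrt(sum(a[s] ** 2 for s in shared)) * sqrt(sum(b[s] ** 2 for s in shared))
    return dot / norm

def best_match(user):
    """Return the other user whose listening taste is closest."""
    others = [(similarity(ratings[user], ratings[o]), o) for o in ratings if o != user]
    return max(others)

print(best_match("anna"))  # with the sample data above, 'ben' is the closest match</pre>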
GLAM mix'n'hack 2019
2019-09-22 11:34:00
Time Gazer
<h1 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="timegazer">TimeGazer</h1> <div class="level1"> <p> Welcome to TimeGazer: A time-traveling photo booth enabling you to send greetings from historical postcards. </p> <p> Based on the wonderful Postcards from Valais (1890 - 1950) dataset, consisting of nearly 4000 historic postcards of Valais, we create a prototype for Mixed-Reality photo booth. </p> <p> Choose a historic postcard as a background and a person will be style-transferred virtually onto the postcard. </p> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="photobomb_a_historical_postcard">Photobomb a historical postcard</h2> <div class="level2"> <blockquote><div class="no"> A photo booth for time traveling<br> send greetings from the poster<br> virtually enter the historical postcard</br></br></div></blockquote> <p> <a class="media" href="/wiki/_detail/project:photo_2019-09-06_17-58-57.jpg?id=project%3Atime_gazer" title="project:photo_2019-09-06_17-58-57.jpg"><img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:photo_2019-09-06_17-58-57.jpg"/></a> Mockup of the process. </p> <p> Based on the wonderful Postcards from Valais (1890 - 1950) <a class="urlextern" href="https://opendata.swiss/en/dataset/postcards-from-valais-1890-1950" rel="nofollow" title="https://opendata.swiss/en/dataset/postcards-from-valais-1890-1950">dataset</a>, consisting of nearly 4000 historic postcards of Valais, we create a prototype for Mixed-Reality photo booth. One can choose a historic postcard as a background and a person will be style-transferred virtually onto the postcard. </p> <p> Potentially with VR-trackerified things to add choosable objects virtually into the scene. </p> </div> <h2 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="technology">Technology</h2> <div class="level2"> <p> This project is roughly based on a <a class="urlextern" href="http://make.opendata.ch/wiki/project:virtual_3d_exhibition" rel="nofollow" title="http://make.opendata.ch/wiki/project:virtual_3d_exhibition">project</a> from last year, which resulted in an active research project at <a class="urlextern" href="https://dbis.dmi.unibas.ch/" rel="nofollow" title="https://dbis.dmi.unibas.ch/">Databases and Information Systems</a> group of the <a class="urlextern" href="https://www.unibas.ch/de" rel="nofollow" title="https://www.unibas.ch/de">University of Basel</a>: <a class="urlextern" href="https://dbis.dmi.unibas.ch/research/projects/virtue/" rel="nofollow" title="https://dbis.dmi.unibas.ch/research/projects/virtue/">VIRTUE</a>. 
Hence, we use a similar setup: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="urlextern" href="https://www.vive.com/eu/product/vive-pro/" rel="nofollow" title="https://www.vive.com/eu/product/vive-pro/">HTC Vive Pro</a> VR-Headset</div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://unity3d.com" rel="nofollow" title="https://unity3d.com">Unity</a></div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://towardsdatascience.com/style-transfer-styling-images-with-convolutional-neural-networks-7d215b58f461" rel="nofollow" title="https://towardsdatascience.com/style-transfer-styling-images-with-convolutional-neural-networks-7d215b58f461">Style Transfer - Styling Images with Convolutional Neural Networks</a></div> </li> </ul> </div> <h2 class="sectionedit4 page-header pb-3 mb-4 mt-5" id="results">Results</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="urlextern" href="https://time-gazer.squarespace.com/" rel="nofollow" title="https://time-gazer.squarespace.com/">Website</a> (password : Valais)</div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://vimeo.com/358591907" rel="nofollow" title="https://vimeo.com/358591907">Video</a></div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://www.instagram.com/timegazervalais/" rel="nofollow" title="https://www.instagram.com/timegazervalais/">Instagram account with the pictures taken</a></div> </li> </ul> </div> <h2 class="sectionedit5 page-header pb-3 mb-4 mt-5" id="project">Project</h2> <div class="level2"> </div> <h3 class="sectionedit6 page-header pb-3 mb-4 mt-5" id="blue_screen">Blue screen</h3> <div class="level3"> <p> <a class="media" href="/wiki/_detail/project:bluescreen-project.png?id=project%3Atime_gazer" title="project:bluescreen-project.png"><img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:bluescreen-project.png"/></a> </p> </div> <h3 class="sectionedit7 page-header pb-3 mb-4 mt-5" id="printer_box">Printer box</h3> <div class="level3"> <p> Standard box on <a class="urlextern" href="https://www.makercase.com/" rel="nofollow" title="https://www.makercase.com/">MakerCase</a>: <a class="media" href="/wiki/_detail/project:printer-box_makercase-optimized.png?id=project%3Atime_gazer" title="project:printer-box_makercase-optimized.png"><img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:printer-box_makercase-optimized.png"/></a> </p> <p> Modified for the input of paper and output of postcard: <a class="media" href="/wiki/_detail/project:printer-box-full.png?id=project%3Atime_gazer" title="project:printer-box-full.png"><img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:printer-box-full.png"/></a> </p> <p> The <a class="urlextern" href="https://drive.google.com/open?id=1ul3G1U09DEGw6OYKdTABR_meCK4bV1Cx" rel="nofollow" title="https://drive.google.com/open?id=1ul3G1U09DEGw6OYKdTABR_meCK4bV1Cx">SVG</a> and <a class="urlextern" href="https://drive.google.com/open?id=1Fc4NClhAYqUZBTUfILFUN7s1CKwjG2M7" rel="nofollow" title="https://drive.google.com/open?id=1Fc4NClhAYqUZBTUfILFUN7s1CKwjG2M7">DXF</a> box project files. </p> </div> <h2 class="sectionedit8 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <p> Quote from the data introduction page: </p> <blockquote><div class="no"> A collection of 3900 postcards from Valais. 
Some highlights are churches, cable cars, landscapes and traditional costumes.<br> Source: Musées cantonaux du Valais – Musée d'histoire </br></div></blockquote> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="urlextern" href="https://opendata.swiss/en/dataset/postcards-from-valais-1890-1950" rel="nofollow" title="https://opendata.swiss/en/dataset/postcards-from-valais-1890-1950">https://opendata.swiss/en/dataset/postcards-from-valais-1890-1950</a></div> </li> </ul> </div> <h2 class="sectionedit9 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Dr. Ivan Giangreco</div> </li> <li class="level1"><div class="li"> Dr. Johann Roduit</div> </li> <li class="level1"><div class="li"> Lionel Walter</div> </li> <li class="level1"><div class="li"> Loris Sauter</div> </li> <li class="level1"><div class="li"> <a class="wikilink2" href="/wiki/user:lpalli" rel="nofollow" title="user:lpalli">Luca</a> Palli</div> </li> <li class="level1"><div class="li"> Ralph Gasser</div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> glam</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:tourism?do=showtag&tag=tourism" rel="tag" title="tag:tourism"><span class="iconify" data-icon="mdi:tag-text-outline"></span> tourism</a> </span></div> </div>
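<p> The Technology section above mentions style transfer with convolutional neural networks. The project's actual Unity pipeline is not documented here, so the following is only an illustrative sketch, assuming the pre-trained "arbitrary image stylization" model from TensorFlow Hub and placeholder file names. </p> <pre class="code"># Illustrative sketch (not the project's Unity pipeline): transfer the style of
# a historic postcard onto a visitor photo using a pre-trained TF Hub model.
# File names are placeholders.
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub
from PIL import Image

def load(path, size=None):
    """Load an image as a float32 batch tensor in [0, 1]."""
    img = Image.open(path).convert("RGB")
    if size:
        img = img.resize(size)
    return tf.constant(np.array(img, dtype=np.float32)[np.newaxis, ...] / 255.0)

model = hub.load("https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2")
content = load("visitor_photo.jpg")
style = load("valais_postcard.jpg", size=(256, 256))  # the style image works best at 256x256

stylized = model(content, style)[0]                   # batch of stylized images
Image.fromarray(np.uint8(stylized[0].numpy() * 255)).save("postcard_greeting.jpg")</pre>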
GLAM mix'n'hack 2019
2019-09-24 01:34:00
Human Name Creativity
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="human_name_creativity">Human Name Creativity</h2> <div class="level2"> <p> Following the last years project about dog names Dog Name Creativity Survey of New York City <a class="wikilink1" href="/wiki/project:dncsonyc" title="project:dncsonyc">Dog Name Creativity Survey of New York City</a>. The focus this year was on human names. The swiss post provides datasets with the top 5 names from each postal code. The goal was again to create a creativity index. But this year, under the motto of user involvement with the option to enter your own name, set the language your name is from and to see yourself in the ranking. The datasets are not perfect for this task, because they dont contain all the names, only the top 5 per postal code. So the user has a high chance to get a score-buff for uniqueness. Nevertheless it is a fun project. </p> <p> Unfortunately it wasnt finished until the end of the Hackathon, no UI, but here's the last draft version of the code: </p> <pre class="code">import pandas as pd HaufeD_ = {"e":1,"n":2,"i":3,"r":4,"s":5,"-":5,"t":6,"a":7,"d":8,"h":9,"u":10,"l":11,"c":12,"g":13,"m":14,"o":15,"b":16, \ "w":17,"f":18,"k":19,"z":20,"v":21,"p":22,"":23,"":24,"":25,"j":26,"x":27,"y":28,"q":29} HaufeF_ = {"e":1,"a":2,"s":3,"t":4,"i":5,"-":5,"r":6,"n":7,"u":8,"l":9,"o":10,"d":11,"m":12,"c":13,"p":14,"":15,"v":16, \ "h":17,"g":18,"f":19,"b":20,"q":21,"j":22,"":23,"x":24,"":25,"":26,"z":27,"y":28,"k":29,"":29,"":29,"w":29 \ ,"":29,"":29,"":29,"":29,"":29,"":29,"":29,"":29} #HaufeI_ = landics = {"d":HaufeD_,"f":HaufeF_} def KreaWert(name_,lan): dic = landics[lan] name_ = str(name_) wert_ = 0 for letter in str.lower(name_): temp_ = 0 if letter in dic : temp_ += dic[letter] wert_ += temp_ else: temp_ += 20 wert_ += temp_ try: H_[name_] wert_ = wert_* ((Hmax-H_[name_])/(Hmax-1)*5 + 0.2) except KeyError as exception: pass if len(name_) < (DNL-2) or len(name_) > (DNL+2): wert_ = wert_/10*8 return round(wert_,1) df = pd.read_csv("vornamen_proplz.csv", sep = ",") df["vorname"] = df["vorname"].str.strip() insgeNamLan_ = 0 for name in df["vorname"]: insgeNamLan_ += len(str(name)) #unkreativittsrange = weniger als 4 / mehr als 8 DNL = round(insgeNamLan_ / len(df["vorname"])) #Hufigkeit der Namen = H_ H_ = {} counter = 0 for name in df["vorname"]: if name in H_: H_[name] += df["anzahl"][counter] counter += 1 else: H_[name] = df["anzahl"][counter] counter +=1 sortH_ = sorted(H_.values()) Hmax = sortH_[len(sortH_)-1] Hmin = sortH_[0] lan = input("Set the language of your name (d/i/f): ") name_ = input("What is your first name? 
") print(KreaWert(name_,lan))</pre> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <p> Vor- und Nachnamen pro Postleitzahl: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="urlextern" href="https://opendata.swiss/de/dataset/vornamen-pro-plz" rel="nofollow" title="https://opendata.swiss/de/dataset/vornamen-pro-plz">https://opendata.swiss/de/dataset/vornamen-pro-plz</a></div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://opendata.swiss/de/dataset/nachnamen-pro-plz" rel="nofollow" title="https://opendata.swiss/de/dataset/nachnamen-pro-plz">https://opendata.swiss/de/dataset/nachnamen-pro-plz</a></div> </li> </ul> </div> <h2 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="wikilink2" href="/wiki/user:dsievi" rel="nofollow" title="user:dsievi">dsievi</a></div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:data?do=showtag&tag=needs%3Adata" rel="tag" title="needs:data"><span class="iconify" data-icon="mdi:tag-text-outline"></span> data</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:expert?do=showtag&tag=needs%3Aexpert" rel="tag" title="needs:expert"><span class="iconify" data-icon="mdi:tag-text-outline"></span> expert</a> </span></div> </div>
Open Cultural Data Hackathon
2018-10-28 12:59:00
Art on Paper Gallery
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="art_on_paper_gallery">Art on Paper Gallery</h2> <div class="level2"> <p> We develop a gallery app for browsing art works on paper. For the prototype we use a dataset sample delivered from the Collection Online of the Graphische Sammlung ETH Zurich. In our app the user can find the digital images of the prints and drawings, gets metadata information about the different techniques and other details. The app invites the user to browse from one art work to the other, following different paths such as the same technique, the same artist, the same subject and so on. </p> <p> <strong>Challenge</strong> </p> <p> To use a Collection Online properly the user needs previous knowledge. Many people just love art and are interested but no experts. </p> <p> <strong>User</strong> </p> <p> Especially this group of people is invited to explore our large collection in an interactive journey. </p> <p> <strong>Goals</strong> </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> The Art on Paper Gallery App enables the user to jump from one artwork to another in an associative way. It offers suggestions following different categories, such as the artist, technique, etc. </div> </li> <li class="level1"><div class="li"> It allows social interaction with the possibility to like, share and comment an artwork</div> </li> <li class="level1"><div class="li"> Artworks can be arranged according to relevance, number of clicks etc.</div> </li> <li class="level1"><div class="li"> This again allows Collections or Museums to evaluate the user interests and trends</div> </li> </ul> <p> <strong>Code</strong> </p> <p> The code is available at the following link: <a class="urlextern" href="https://github.com/DominikStefancik/Art-on-Paper-Gallery-App" rel="nofollow" title="https://github.com/DominikStefancik/Art-on-Paper-Gallery-App">https://github.com/DominikStefancik/Art-on-Paper-Gallery-App</a>. 
</p> <p> <img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:artonpapergallery.jpg"/> </p> <p> <strong>Example of a possible Design</strong> </p> <p> <img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:artonpapergallery_start.jpg"/> </p> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Graphische Sammlung ETH Zurich, Collection Online, sample dataset with focus on different techniques of printmaking and drawing</div> </li> </ul> </div> <h2 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Dominik tefanik, Software Engineer</div> </li> <li class="level1"><div class="li"> Graphische Sammlung ETH Zurich, Susanne Pollack, Ann-Kathrin Seyffer</div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:data?do=showtag&tag=needs%3Adata" rel="tag" title="needs:data"><span class="iconify" data-icon="mdi:tag-text-outline"></span> data</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:expert?do=showtag&tag=needs%3Aexpert" rel="tag" title="needs:expert"><span class="iconify" data-icon="mdi:tag-text-outline"></span> expert</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> glam</a> </span></div> </div>
Open Cultural Data Hackathon
2018-10-28 13:25:00
Artify
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="artify">Artify</h2> <div class="level2"> <p> <img alt="" class="mediacenter img-responsive" src="/wiki/_media/project:titel3.jpg?w=250&tok=c81b3f" width="250"/> </p> <p> Explore the collection in a new, interessting way </p> <p> You have to find objects, which have similar metadata and try to match them. The displayed objects are (semi-)randomly selected from a dataset (eg. from SNM). From the metadata of the starting object, the app will search for three other objects: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> One which matches in 2+ metadata tags</div> </li> <li class="level1"><div class="li"> One which matches in 1 metadata tag</div> </li> <li class="level1"><div class="li"> One which is completly random.</div> </li> </ul> <p> If you choose the right one, the app will display three new objects accordingly to the way explained above. </p> <p> Tags used from the datasets: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> OBJEKT Klassifikation (x)</div> </li> <li class="level1"><div class="li"> OBJEKT Webtext</div> </li> <li class="level1"><div class="li"> OBJEKT Datierung (x)</div> </li> <li class="level1"><div class="li"> OBJEKT Herstellung (x)</div> </li> <li class="level1"><div class="li"> OBJEKT Herkunft (x)</div> </li> </ul> <p> (x) = used for matching </p> </div> <h4 id="to_do">To Do</h4> <div class="level4"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Datasets are too divers; in some cases there is no match. Datasets need to be prepared.</div> </li> <li class="level1"><div class="li"> The tag Klassifikation is too specific</div> </li> <li class="level1"><div class="li"> The tags Herstellung and Herkunft are often empty or not consistent.</div> </li> </ul> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> The gaming aspect needs to be implemented</div> </li> <li class="level1"><div class="li"> </div> </li> </ul> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="use_case">Use case</h2> <div class="level2"> <p> There are various cases, where the app could be used. It mainly depends on the datasets you use: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Explore hidden objects of a museum collection</div> </li> <li class="level1"><div class="li"> Train students to identify art periods</div> </li> <li class="level1"><div class="li"> Find connections between museums, which are not obvious (e.g. art and historical objects)</div> </li> </ul> </div> <h2 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <p> Democase: </p> <p> SNM <a class="urlextern" href="https://opendata.swiss/en/organization/schweizerisches-nationalmuseum-snm" rel="nofollow" title="https://opendata.swiss/en/organization/schweizerisches-nationalmuseum-snm">https://opendata.swiss/en/organization/schweizerisches-nationalmuseum-snm</a> </p> <p> > Build with two sets: Technologie und Brauchtum / Kutschen & Schlitten & Fahrzeuge </p> </div> <h2 class="sectionedit4 page-header pb-3 mb-4 mt-5" id="links">Links</h2> <div class="level2"> <p> Github: <a class="urlextern" href="https://github.com/zack17/ocdh2018" rel="nofollow" title="https://github.com/zack17/ocdh2018">https://github.com/zack17/ocdh2018</a> </p> <p> Tech. 
Demo: <a class="urlextern" href="https://zack17.github.io/ocdh2018/" rel="nofollow" title="https://zack17.github.io/ocdh2018/">https://zack17.github.io/ocdh2018/</a> </p> <p> Design Demo (not functional): <a class="urlextern" href="https://tempestas.ch/artify/" rel="nofollow" title="https://tempestas.ch/artify/">https://tempestas.ch/artify/</a> </p> </div> <h2 class="sectionedit5 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Micha Reiser</div> </li> <li class="level1"><div class="li"> Jacqueline Martinelli</div> </li> <li class="level1"><div class="li"> Anastasiya Korotkova</div> </li> <li class="level1"><div class="li"> Dominic Studer</div> </li> <li class="level1"><div class="li"> Yaw Lam</div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:data?do=showtag&tag=needs%3Adata" rel="tag" title="needs:data"><span class="iconify" data-icon="mdi:tag-text-outline"></span> data</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:expert?do=showtag&tag=needs%3Aexpert" rel="tag" title="needs:expert"><span class="iconify" data-icon="mdi:tag-text-outline"></span> expert</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> glam</a> </span></div> </div>
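<p> As referenced above, here is a minimal sketch of the described matching rule: from a starting object, offer one object sharing two or more of the matching fields, one sharing exactly one, and one picked at random. This is only an illustration; the sample records and field values are invented, not SNM data (the project's actual code is in the GitHub repository linked above). </p> <pre class="code"># Illustrative sketch of the matching rule described above. Sample records are invented.
import random

MATCH_FIELDS = ["Klassifikation", "Datierung", "Herstellung", "Herkunft"]

objects = [
    {"id": 1, "Klassifikation": "Kutsche",   "Datierung": "1850", "Herstellung": "Bern",   "Herkunft": "Bern"},
    {"id": 2, "Klassifikation": "Kutsche",   "Datierung": "1850", "Herstellung": "Zürich", "Herkunft": "Zürich"},
    {"id": 3, "Klassifikation": "Schlitten", "Datierung": "1850", "Herstellung": "Chur",   "Herkunft": "Graubünden"},
    {"id": 4, "Klassifikation": "Uhr",       "Datierung": "1920", "Herstellung": "Genf",   "Herkunft": "Genf"},
]

def overlap(a, b):
    """Count how many of the matching fields two objects share."""
    return sum(1 for f in MATCH_FIELDS if a.get(f) and a.get(f) == b.get(f))

def next_choices(start, pool):
    """One strong match (2+ shared fields), one weak match (exactly 1), one random object."""
    candidates = [o for o in pool if o["id"] != start["id"]]
    strong = [o for o in candidates if overlap(start, o) >= 2]
    weak = [o for o in candidates if overlap(start, o) == 1]
    choices = [random.choice(strong), random.choice(weak), random.choice(candidates)]
    random.shuffle(choices)  # hide which one is the "right" match
    return choices

print([o["id"] for o in next_choices(objects[0], objects)])</pre>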
Open Cultural Data Hackathon
2018-10-29 14:51:00
Ask the Artist
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="ask_the_artist">Ask the Artist</h2> <div class="level2"> <p> The project idea is to create a voice assistance with the identity of an artist. In our case, we created a demo of the famous Swiss painter Ferdinand Hodler. That is to say, the voice assistance is nor Siri nor Alexa. Instead, it is an avatar of Ferdinand Hodler who can answer your questions about his art and his life. </p> <p> You can directly interact with the program by talking, as what you would do normally in your daily life. You can ask it all kinds of questions about Ferdinand Hodler, e.g.: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> When did you start painting?</div> </li> <li class="level1"><div class="li"> Who taught you painting?</div> </li> <li class="level1"><div class="li"> Can you show me some of your paintings?</div> </li> <li class="level1"><div class="li"> Where can I find an exhibition with your artworks?</div> </li> </ul> <p> By talking to the digital image of the artist directly, we aim to bring the art closer to people's daily life, in a direct, intuitive and hopefully interesting way. </p> <p> As you know, museum audiences need to keep quiet which is not so friendly to children. Also, for people with special needs, like the visually dispaired, and people without professional knowledge about art, it is not easy for them to enjoy the museum visit. To make art accessible to more people, a voice assistance can help with solving those barriers. </p> <p> If you asked the difference between our product with Amazon's Alexa or Apple's Siri, there are two major points: </p> <ol class=" fix-media-list-overlap"> <li class="level1"><div class="li"> The user can interact with the artist in a direct way: talking to each other. In other applications, the communication happened by Alexa or Siri to deliver the message as the 3rd party channel. In our case, users can have immersive and better user experienceand they will feel like if they were talking to an artist friend, not an application.</div> </li> </ol> <ol class=" fix-media-list-overlap"> <li class="level1"><div class="li"> The other difference is that the answers to the questions are preset. The essence of how Alexa or Siri works is that they search the question asked by users online and read the returned search results out. In that case, we cannot make sure that the answer is correct and/or suitable. However, in our case, all the answers are coming from reliable data sets of museum and other research institutions, and have been verified and proofread by the art experts. Thus, we can proudly say, the answers from us are reliable and correct. People can use it as a tool to educate children or as visiting assistance in the exhibition. 
</div> </li> </ol> <p> <a class="media" href="/wiki/_detail/project:ask_the_artist.jpg?id=project%3Aweb_exhibition" title="project:ask_the_artist.jpg"><img alt="" class="media img-responsive" src="/wiki/_media/project:ask_the_artist.jpg?w=200&tok=4a7044" width="200"/></a> <a class="media" href="/wiki/_detail/project:121211111.jpg?id=project%3Aweb_exhibition" title="project:121211111.jpg"><img alt="" class="media img-responsive" src="/wiki/_media/project:121211111.jpg?w=200&tok=cf36f8" width="200"/></a> <a class="media" href="/wiki/_detail/project:11111111.jpg?id=project%3Aweb_exhibition" title="project:11111111.jpg"><img alt="" class="media img-responsive" src="/wiki/_media/project:11111111.jpg?w=200&tok=0ebe45" width="200"/></a> </p> <p> Video demo: </p> <iframe allowfullscreen="" class="vshare__none" frameborder="0" height="239" scrolling="no" src="//www.youtube-nocookie.com/embed/DlABgOf0b8w" width="425"></iframe> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> List and link your actual and ideal data sources.</div> </li> </ul> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <strong>Kunsthaus Zrich</strong></div> </li> </ul> <p> List of all Exhibitions at Kunsthaus Zrich </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <strong>SIK-ISEA</strong></div> </li> </ul> <p> Artist data from the SIKART Lexicon on art in Switzerland </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <strong>Swiss National Museum</strong></div> </li> </ul> <p> Representative sample from the Paintings & Sculptures Collection (images and metadata) </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <strong>Wikimedia Switzerland</strong></div> </li> </ul> </div> <h2 class="sectionedit3 page-header pb-3 mb-4 mt-5" id="team">Team</h2> <div class="level2"> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> Angelica</div> </li> <li class="level1"><div class="li"> Barbara</div> </li> <li class="level1"><div class="li"> Anlin (lianganlin@foxmail.com)</div> </li> </ul> <div class="tags"><span> <a class="wikilink1 tag label label-default mx-1" href="/wiki/status:concept?do=showtag&tag=status%3Aconcept" rel="tag" title="status:concept"><span class="iconify" data-icon="mdi:tag-text-outline"></span> concept</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:dev?do=showtag&tag=needs%3Adev" rel="tag" title="needs:dev"><span class="iconify" data-icon="mdi:tag-text-outline"></span> dev</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:design?do=showtag&tag=needs%3Adesign" rel="tag" title="needs:design"><span class="iconify" data-icon="mdi:tag-text-outline"></span> design</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:data?do=showtag&tag=needs%3Adata" rel="tag" title="needs:data"><span class="iconify" data-icon="mdi:tag-text-outline"></span> data</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/needs:expert?do=showtag&tag=needs%3Aexpert" rel="tag" title="needs:expert"><span class="iconify" data-icon="mdi:tag-text-outline"></span> expert</a>, <a class="wikilink1 tag label label-default mx-1" href="/wiki/tag:glam?do=showtag&tag=glam" rel="tag" title="tag:glam"><span class="iconify" data-icon="mdi:tag-text-outline"></span> glam</a> </span></div> </div>
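<p> As referenced above, a minimal sketch of answering from a curated set of preset, expert-verified answers. The project's actual question/answer data and its speech pipeline are not documented in this wiki, so the entries and the fuzzy string matching below are assumptions for illustration only; speech recognition and text-to-speech are out of scope here. </p> <pre class="code"># Illustrative sketch: answer a transcribed visitor question from a curated set of
# preset answers, as described above. The entries below are invented examples.
import difflib

preset_answers = {
    "when did you start painting": "I moved to Geneva to study painting in the early 1870s.",
    "who taught you painting": "I studied under Barthélemy Menn in Geneva.",
    "can you show me some of your paintings": "Certainly, here are a few works from the collection.",
    "where can i find an exhibition with your artworks": "The Kunsthaus Zürich regularly shows my paintings.",
}

def answer(question, cutoff=0.5):
    """Return the preset answer whose question is closest to the transcribed one."""
    cleaned = question.lower().strip("?! ")
    match = difflib.get_close_matches(cleaned, list(preset_answers), n=1, cutoff=cutoff)
    if match:
        return preset_answers[match[0]]
    return "I am afraid I do not have an answer to that yet."

print(answer("Who taught you how to paint?"))</pre>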
Open Cultural Data Hackathon
2018-10-28 15:42:00
Dog Name Creativity Survey of New York City
<h2 class="sectionedit1 page-header pb-3 mb-4 mt-5" id="dog_name_creativity_survey_of_new_york_city">Dog Name Creativity Survey of New York City</h2> <div class="level2"> <p> <a class="media" href="/wiki/_detail/project:dncrsesult1.png?id=project%3Adncsonyc" title="project:dncrsesult1.png"><img alt="How does the creativity of given dog names related to the amount of culture found in the different boroughs of New York City?" class="media img-responsive" src="/wiki/_media/project:dncrsesult1.png?w=600&tok=f2a1ba" title="How does the creativity of given dog names related to the amount of culture found in the different boroughs of New York City?" width="600"/></a> </p> <p> We started this project to see if art and cultural institutions in the environment have an impact on the creativity of dognames. This was not possible with the date from Zurich because the name-dataset does not contain information about the location and the dataset about the owners does not include the dognames. We choose to stick with our idea but used a different dataset: NYC Dog Licensing Dataset. </p> <p> The creativity of a name is measured by the frequency of each letter in the English language and gets +/- points according to the amount of dogs with the same name. The data for the cultural environment comes from Wikidata. </p> <p> After some data-cleaning with OpenRefine and failed attempts with OpenCalc we got the following code: </p> <pre class="code">import string import pandas as pd numbers_ = {"e":1,"t":2,"a":3,"o":4,"n":5,"i":6,"s":7,"h":8,"r":9,"l":10,"d":11,"u":12,"c":13,"m":14,"w":15,"y":16,"f":17,"g":18,"p":19,"b":20,"v":21,"k":22,"j":23,"x":24,"q":25,"z":26} name_list = [] def KreaWert(name_): name_ = str(name_) wert_ = 0 for letter in str.lower(name_): temp_ = 0 if letter in string.ascii_lowercase : temp_ += numbers_[letter] wert_ += temp_ if name_ in H_: wert_ = wert_* ((Hmax-H_[name_])/(Hmax-1)*5 + 0.2) return round(wert_,1) df = pd.read_csv("Vds3.csv", sep = ";") df["AnimalName"] = df["AnimalName"].str.strip() H_ = df["AnimalName"].value_counts() Hmax = max(H_) Hmin = min(H_) df["KreaWert"] = df["AnimalName"].map(KreaWert) df.to_csv("namen2.csv") dftemp = df[["AnimalName", "KreaWert"]].drop_duplicates().set_index("AnimalName") dftemp.to_csv("dftemp.csv") df3 = pd.DataFrame() df3["amount"] = H_ df3 = df3.join(dftemp, how="outer") df3.to_csv("data3.csv") df1 = round(df.groupby("Borough").mean(),2) df1.to_csv("data1.csv") df2 = round(df.groupby(["Borough","AnimalGender"]).mean(),2) df2.to_csv("data2.csv")</pre> <p> Visualisations were made with D3: <a class="urlextern" href="https://d3js.org/" rel="nofollow" title="https://d3js.org/">https://d3js.org/</a> </p> </div> <h2 class="sectionedit2 page-header pb-3 mb-4 mt-5" id="data">Data</h2> <div class="level2"> <p> Hundedaten der Stadt Zrich: </p> <ul class=" fix-media-list-overlap"> <li class="level1"><div class="li"> <a class="urlextern" href="https://opendata.swiss/de/dataset/hundenamen-aus-dem-hundebestand-der-stadt-zurich" rel="nofollow" title="https://opendata.swiss/de/dataset/hundenamen-aus-dem-hundebestand-der-stadt-zurich">https://opendata.swiss/de/dataset/hundenamen-aus-dem-hundebestand-der-stadt-zurich</a></div> </li> <li class="level1"><div class="li"> <a class="urlextern" href="https://opendata.swiss/de/dataset/hundebestand-der-stadt-zurich" rel="nofollow" title="https://opendata.swiss/de/dataset/hundebestand-der-stadt-zurich">https://opendata.swiss/de/dataset/hundebestand-der-stadt-zurich</a></div> </li> </ul> <p> NYC Dog Licensing Dataset: </p> 
Data

Dog data of the City of Zurich:

• https://opendata.swiss/de/dataset/hundenamen-aus-dem-hundebestand-der-stadt-zurich
• https://opendata.swiss/de/dataset/hundebestand-der-stadt-zurich

NYC Dog Licensing Dataset:

• https://data.cityofnewyork.us/Health/NYC-Dog-Licensing-Dataset/nu7n-tubp

Team

• Birk Weiberg
• Dominik Sievi

Tags: concept, dev, design, data, expert, glam
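Tying the two sides together, the per-borough creativity averages written to data1.csv above could be merged with the Wikidata institution counts. The following is a minimal sketch, assuming the query result has been exported to a hypothetical culture_counts.csv with columns Borough and institutions:

import pandas as pd

# data1.csv is produced by the script above (mean KreaWert per borough);
# culture_counts.csv is an assumed export of the Wikidata counts.
creativity = pd.read_csv("data1.csv", index_col="Borough")
culture = pd.read_csv("culture_counts.csv", index_col="Borough")

merged = creativity.join(culture, how="inner")

# Simple check of the project's question: do boroughs with more cultural
# institutions also have more creatively named dogs?
print(merged[["KreaWert", "institutions"]].corr())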
Open Cultural Data Hackathon
2018-10-27 20:36:00
Find Me an Exhibit
Find Me an Exhibit

Are you ready to take up the challenge? Film categories of objects in the exhibition "History of Switzerland" while racing against the clock.

The app displays one of several categories of exhibits that can be found in the exhibition (such as "clothes", "paintings" or "clocks"). Your job is to find a matching exhibit as quickly as possible. You don't have much time, so hurry up!

Best played on portable devices.

The frontend of the app is based on the game Emoji Scavenger Hunt (https://github.com/google/emoji-scavenger-hunt); the model is built with TensorFlow.js (https://js.tensorflow.org/) and fed with a large set of images (https://opendata.swiss/en/dataset?q=&organization=schweizerisches-nationalmuseum-snm) kindly provided by the National Museum Zurich (https://www.nationalmuseum.ch/e/). The app is in pre-alpha stage.

Data

• Code: https://github.com/dataramblers/glamhack18
• Demo: https://game.annotat.net

Team

• Some data ramblers

Tags: concept, dev, design, data, expert
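The project page does not describe how the training images were gathered. Since opendata.swiss is CKAN-based, one way to enumerate the National Museum's published datasets is the standard package_search API; the sketch below is an assumed workflow, not the team's actual tooling, and resource field names may vary between datasets.

# Hypothetical sketch: list datasets published by the Swiss National Museum on
# opendata.swiss as a starting point for collecting training images. package_search
# is the standard CKAN search action; the organization slug comes from the link above.
import requests

API = "https://opendata.swiss/api/3/action/package_search"
params = {"fq": "organization:schweizerisches-nationalmuseum-snm", "rows": 100}

resp = requests.get(API, params=params, timeout=60)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} datasets found")
for dataset in result["results"]:
    for resource in dataset.get("resources", []):
        # Each CKAN resource typically carries a download URL and a format field.
        print(resource.get("format"), resource.get("url"))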

Statistics

Average successful run time: 10 minutes

Total run time: 21 minutes

Total cpu time used: half a minute

Total disk space used: 1.84 MB

History

  • Manually ran revision b1be6554 and completed successfully.
    227 records added in the database
  • Manually ran revision 6267e4a1 and failed.
    228 records removed in the database
  • Manually ran revision 6267e4a1 and completed successfully.
    228 records added in the database
    17 pages scraped
  • Manually ran revision b9ec3609 and failed.
  • Manually ran revision c5f803cc and failed.
  • Created on morph.io

Scraper code

dokuwiki-projects