{"id":367,"date":"2017-03-07T10:17:30","date_gmt":"2017-03-07T10:17:30","guid":{"rendered":"http:\/\/deberker.com\/archy\/?p=367"},"modified":"2021-11-06T15:55:52","modified_gmt":"2021-11-06T15:55:52","slug":"building-a-burns-bot-1-markov-models-for-poetry-generation","status":"publish","type":"post","link":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/","title":{"rendered":"Building a Burns bot #1: Markov models for poetry generation"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" src=\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2020\/05\/image.jpeg?resize=287%2C370\" alt=\"\" class=\"wp-image-645\" width=\"287\" height=\"370\"\/><figcaption>Rabbie Burns, a.k.a the Bard of Ayrshire<\/figcaption><\/figure>\n\n\n<p><em>Originally published on&nbsp;the <a href=\"http:\/\/blog.asidatascience.com\/teaching-a-computer-to-write-poetry\/\">ASI Blog&nbsp;<\/a>.<\/em><\/p>\n<p><a href=\"http:\/\/www.robertburns.org\/suppers\/\">Robert (or Rabbie) Burns<\/a> was a Scottish poet whose corpus includes \u2018An Ode to A Haggis\u2019 and the New Year\u2019s Eve favourite \u2018Auld Lang Syne\u2019. Each year on January 25th people throughout the UK come together to celebrate his life, for a night that typically revolves around two of Scotland\u2019s most famous culinary exports: haggis and whisky. This year, the ASI team decided to celebrate Burn\u2019s night in a creative manner: building a robot to produce Burns-esque poetry*.<\/p>\n<p><b>Producing sentences with machines<\/b><\/p>\n<p><span style=\"font-weight: 400;\">How can a machine generate meaningful text? You can think of a couple of ways to approach this problem. 
The first strategy is to start with a big collection of words &#8211; say all of the words that Burns ever used &#8211; and train the algorithm to pick collections of words that form plausible sentences. The second, more fine-grained option is to generate words letter by letter. To do this, we have a collection of possible letters (A-Z), and we train the algorithm to select letters that produce real words. We do this over and over, and we end up with sentences. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">In both cases, the problem boils down to one of probability. We want the machine to generate words and sentences which seem <\/span><i><span style=\"font-weight: 400;\">plausible<\/span><\/i><span style=\"font-weight: 400;\">, which is another way of saying that they have <\/span><i><span style=\"font-weight: 400;\">high probability<\/span><\/i><span style=\"font-weight: 400;\">. For example, the sentence \u2018I ate some haggis\u2019 has a higher probability than the sentence \u2018carpet grab bag leg\u2019. Similarly, the combination of letters arranged as \u2018the\u2019 is very probable; the combination \u2018xyo\u2019 is not very probable.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">We can go one step further: in order to generate sentences that sound like they were written by Robert Burns, we train upon sentences written by Burns. The algorithm thus learns to generate sentences that have a high probability in \u2018Burns language\u2019, which, <\/span><a href=\"http:\/\/www.robertburns.org\/works\/496.shtml\"><span style=\"font-weight: 400;\">as you\u2019ll appreciate if you\u2019ve read any of his poems<\/span><\/a><span style=\"font-weight: 400;\">, is not quite the same as normal English. 
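<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As a toy illustration of probable versus improbable letter combinations, we can count how often each three-letter window occurs in a scrap of text; any combination the model has never seen comes out with probability zero (a sketch for illustration, not the code we used):<\/span><\/p>

```python
from collections import Counter

# Any English text will do as a stand-in training corpus.
sample = "the man ate some haggis and then the man cooked some more haggis"

# Count every three-letter window in the text.
trigrams = Counter(sample[i:i + 3] for i in range(len(sample) - 2))
total = sum(trigrams.values())

# 'the' occurs several times, so it gets a healthy probability;
# 'xyo' never occurs, so the model assigns it probability zero.
print(trigrams["the"] / total)
print(trigrams["xyo"] / total)  # 0.0
```

<p><span style=\"font-weight: 400;\">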
In this manner, we could teach a machine to write like Burns, or Wikipedia, or Shakespeare.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">So how do we actually <\/span><i><span style=\"font-weight: 400;\">do <\/span><\/i><span style=\"font-weight: 400;\">this? We need to specify a model by which the bot can learn what sentences are probable, and then produce them. Here we describe an approach using Markov chains (don\u2019t worry, we unpack that term below); in a later blog post we will discuss how neural networks can provide a powerful alternative approach.<\/span><\/p>\n<p><b>Markov models for sentence generation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">We used a <\/span><a href=\"https:\/\/blog.codinghorror.com\/markov-and-you\/\"><span style=\"font-weight: 400;\">Markov chain model<\/span><\/a><span style=\"font-weight: 400;\"> to generate sentences in a word-by-word fashion. Despite being rather intimidatingly named, Markov models capture a simple idea: that you can work out the likely next word given the past few words. These past few words constitute a <\/span><i><span style=\"font-weight: 400;\">state<\/span><\/i><span style=\"font-weight: 400;\">. The model is trained by splitting up the training corpus into these \u2018states\u2019, and for each state noting what the likely next word is. By doing this repeatedly, we can generate sequences of plausible words.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, let\u2019s assume we trawl our text and find that we have the phrases \u2018the man jumped\u2019 and \u2018the man cooked\u2019. If we define our state as \u2018the man\u2019, we can quickly see that the <\/span><i><span style=\"font-weight: 400;\">only possible<\/span><\/i><span style=\"font-weight: 400;\"> words are \u2018jumped\u2019 and \u2018cooked\u2019, which are each 50% likely. Other words, like \u2018banana\u2019 and \u2018woman\u2019, are impossible (that is, p=0). 
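<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The counting behind this example can be sketched in a few lines of Python (a toy illustration using the same two-word states, not the code we actually used):<\/span><\/p>

```python
from collections import Counter, defaultdict

phrases = ["the man jumped", "the man cooked"]
STATE_SIZE = 2  # a state is the previous two words

# For each two-word state, count which word follows it in the corpus.
transitions = defaultdict(Counter)
for phrase in phrases:
    words = phrase.split()
    for i in range(len(words) - STATE_SIZE):
        state = tuple(words[i:i + STATE_SIZE])
        transitions[state][words[i + STATE_SIZE]] += 1

# From the state ('the', 'man'), 'jumped' and 'cooked' each get p=0.5;
# any unseen word ('banana', 'woman', ...) implicitly gets p=0.
counts = transitions[("the", "man")]
total = sum(counts.values())
print({word: n / total for word, n in counts.items()})
# → {'jumped': 0.5, 'cooked': 0.5}
```

<p><span style=\"font-weight: 400;\">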
Note also that other words which might be plausible &#8211; like \u2018ran\u2019 or \u2018awoke\u2019 &#8211; are also assigned p=0, because they didn\u2019t appear in the training material. The model doesn\u2019t know anything apart from what it saw in the data &#8211; no knowledge of semantics, or syntax, or grammar. It\u2019s a rather ignorant model, but it still does a surprisingly good job! If you\u2019d like to have a go at generating your own Markov model, <\/span><a href=\"https:\/\/de-code.github.io\/markovy-online\/\"><span style=\"font-weight: 400;\">you can do so here<\/span><\/a><span style=\"font-weight: 400;\"> courtesy of Daniel Ecer.<\/span><\/p>\n<p><b>Generating poetry with a Markov model<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The process of generating sequences of words is visualised below. To start our sentence, we pick a random state &#8211; \u2018the man\u2019 &#8211; from the corpus. We then perform a <\/span><i><span style=\"font-weight: 400;\">random walk<\/span><\/i><span style=\"font-weight: 400;\"> through the model. At each step, we choose from the likely next words according to their probabilities. So if a word is 90% likely to follow from a given state, we pick it 90% of the time in our random walk. Here \u2018jumped\u2019 and \u2018cooked\u2019 are equiprobable, and we end up choosing \u2018cooked\u2019. This gives us our next state &#8211; \u2018man cooked\u2019 &#8211; and the process begins again, up to some specified sentence length. For this state, we\u2019ve coloured the possible choices by their probability, with more opaque corresponding to higher probability. 
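<\/span><\/p>\n<p><span style=\"font-weight: 400;\">That random walk can be sketched as follows, assuming we already have a table mapping each state to counts of the words that followed it (the table below is hand-built for illustration):<\/span><\/p>

```python
import random

# A hand-built transition table: state -> counts of following words.
transitions = {
    ("the", "man"): {"jumped": 1, "cooked": 1},
    ("man", "cooked"): {"haggis": 3, "pork": 2},
    ("cooked", "pork"): {"today": 1},
}

def random_walk(state, steps):
    """Extend a chain of words by sampling each next word in
    proportion to how often it followed the current state."""
    chain = list(state)
    for _ in range(steps):
        counts = transitions.get(tuple(chain[-2:]))
        if counts is None:  # dead end: no known continuation
            break
        words = list(counts)
        next_word = random.choices(words, weights=counts.values())[0]
        chain.append(next_word)
    return " ".join(chain)

print(random_walk(("the", "man"), steps=3))
```

<p><span style=\"font-weight: 400;\">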
You can see that we actually ended up selecting the second most probable option (pork), generating the new state \u2018cooked pork\u2019.<\/span><\/p>\n<figure id=\"attachment_368\" aria-describedby=\"caption-attachment-368\" style=\"width: 580px\" class=\"wp-caption alignleft\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" class=\"wp-image-368 size-medium\" src=\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/GroupMarkovModel2.png?resize=580%2C181\" alt=\"\" width=\"580\" height=\"181\" srcset=\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/GroupMarkovModel2.png?resize=580%2C181&amp;ssl=1 580w, https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/GroupMarkovModel2.png?resize=768%2C240&amp;ssl=1 768w, https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/GroupMarkovModel2.png?w=830&amp;ssl=1 830w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><figcaption id=\"caption-attachment-368\" class=\"wp-caption-text\">Generating sequences of words using a Markov chain model. From a random starting state, we iteratively generate a chain of plausible words. In the above, underlining defines states, red denotes a candidate word generated by the model, and opacity corresponds to probability.<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400;\">We implemented this in Python using a neat open-source package called <\/span><a href=\"https:\/\/github.com\/jsvine\/markovify\"><span style=\"font-weight: 400;\">Markovify<\/span><\/a><span style=\"font-weight: 400;\">. 
Our corpus was provided by the impeccable <\/span><a href=\"https:\/\/www.linkedin.com\/in\/alberto-favaro-404055121\"><span style=\"font-weight: 400;\">Alberto Favaro<\/span><\/a><span style=\"font-weight: 400;\">, who scraped it from <\/span><a href=\"http:\/\/www.scottishpoetrylibrary.org.uk\/poetry\/poets\/robert-burns\"><span style=\"font-weight: 400;\">here<\/span><\/a><span style=\"font-weight: 400;\">. Using a state size of 3, we found that we could produce rather nice poems:<\/span><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">But I look to the North; But what is a watery grave?<\/span><\/em><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">Wha will crack to me my lovely maid.<\/span><\/em><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">Chorus.-Carle, an the King come.<\/span><\/em><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">What says she my dear, my native soil!<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">By specifying the first word of each line, one can also produce acrostics, spelling out words with the first letter of each line:<\/span><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">And here&#8217;s the flower that I loe best, <\/span><\/em><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">So may no ruffian-feeling in my breast, I can feel, by its throbbings<\/span><\/em><\/p>\n<p style=\"padding-left: 30px;\"><em><span style=\"font-weight: 400;\">In vain auld age his body batters, In vain the burns cam down like waters<\/span><\/em><\/p>\n<p><b>Next steps<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The final step was to place the code into a <\/span><a href=\"https:\/\/api.slack.com\/bot-users\"><span style=\"font-weight: 400;\">Slackbot<\/span><\/a><span style=\"font-weight: 400;\">, such that we could integrate it directly into ASI\u2019s Slack channel. 
With the <\/span><a href=\"https:\/\/www.fullstackpython.com\/blog\/build-first-slack-bot-python.html\"><span style=\"font-weight: 400;\">help of a nice guide<\/span><\/a><span style=\"font-weight: 400;\"> and a bit of hacking, we ended up with our very own Burns bot, providing novel poetry on demand.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Over the next couple of posts, we\u2019ll unpack how we packaged our algorithm into a working bot (and give you the opportunity to try out the Burns bot for yourself), and discuss more sophisticated approaches to language generation using a flavour of neural network utilising Long Short-Term Memory (LSTM) units.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">* Being civilised and respectful types, we also drank some whisky.<\/span><\/p>\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>How to generate synthetic poetry with Markov models and a wee dram of whisky.<\/p>\n","protected":false},"author":1,"featured_media":1144,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"hide_page_title":"","_coblocks_attr":"","_coblocks_dimensions":"","_coblocks_responsive_height":"","_coblocks_accordion_ie_support":"","_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2},"jetpack_post_was_ever_published":false},"categories":[1],"tags":[],"class_list":["post-367","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.9 - 
https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Building a Burns bot #1: Markov models for poetry generation - Archy de Berker<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Building a Burns bot #1: Markov models for poetry generation - Archy de Berker\" \/>\n<meta property=\"og:description\" content=\"How to generate synthetic poetry with the Markov models and a wee dram of whisky,\" \/>\n<meta property=\"og:url\" content=\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\" \/>\n<meta property=\"og:site_name\" content=\"Archy de Berker\" \/>\n<meta property=\"article:published_time\" content=\"2017-03-07T10:17:30+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-11-06T15:55:52+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1\" \/>\n\t<meta property=\"og:image:width\" content=\"2560\" \/>\n\t<meta property=\"og:image:height\" content=\"1922\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"archy\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@archydeb\" \/>\n<meta name=\"twitter:site\" content=\"@archydeb\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"archy\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\"},\"author\":{\"name\":\"archy\",\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d\"},\"headline\":\"Building a Burns bot #1: Markov models for poetry generation\",\"datePublished\":\"2017-03-07T10:17:30+00:00\",\"dateModified\":\"2021-11-06T15:55:52+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\"},\"wordCount\":1101,\"publisher\":{\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d\"},\"image\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1\",\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\",\"url\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\",\"name\":\"Building a Burns bot #1: Markov models for poetry generation - Archy de 
Berker\",\"isPartOf\":{\"@id\":\"https:\/\/deberker.com\/archy\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1\",\"datePublished\":\"2017-03-07T10:17:30+00:00\",\"dateModified\":\"2021-11-06T15:55:52+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage\",\"url\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1\",\"contentUrl\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1\",\"width\":2560,\"height\":1922,\"caption\":\"Photo by Aaron Burden on Unsplash\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/deberker.com\/archy\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Building a Burns bot #1: Markov models for poetry 
generation\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/deberker.com\/archy\/#website\",\"url\":\"https:\/\/deberker.com\/archy\/\",\"name\":\"Archy de Berker\",\"description\":\"Building things with data\",\"publisher\":{\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/deberker.com\/archy\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":[\"Person\",\"Organization\"],\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d\",\"name\":\"archy\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2021\/09\/freelance-logo.png?fit=359%2C311&ssl=1\",\"contentUrl\":\"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2021\/09\/freelance-logo.png?fit=359%2C311&ssl=1\",\"width\":359,\"height\":311,\"caption\":\"archy\"},\"logo\":{\"@id\":\"https:\/\/deberker.com\/archy\/#\/schema\/person\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/archydeb\"],\"url\":\"https:\/\/deberker.com\/archy\/author\/archy\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Building a Burns bot #1: Markov models for poetry generation - Archy de Berker","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/","og_locale":"en_US","og_type":"article","og_title":"Building a Burns bot #1: Markov models for poetry generation - Archy de Berker","og_description":"How to generate synthetic poetry with the Markov models and a wee dram of whisky,","og_url":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/","og_site_name":"Archy de Berker","article_published_time":"2017-03-07T10:17:30+00:00","article_modified_time":"2021-11-06T15:55:52+00:00","og_image":[{"width":2560,"height":1922,"url":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","type":"image\/jpeg"}],"author":"archy","twitter_card":"summary_large_image","twitter_creator":"@archydeb","twitter_site":"@archydeb","twitter_misc":{"Written by":"archy","Est. 
reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#article","isPartOf":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/"},"author":{"name":"archy","@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d"},"headline":"Building a Burns bot #1: Markov models for poetry generation","datePublished":"2017-03-07T10:17:30+00:00","dateModified":"2021-11-06T15:55:52+00:00","mainEntityOfPage":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/"},"wordCount":1101,"publisher":{"@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d"},"image":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/","url":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/","name":"Building a Burns bot #1: Markov models for poetry generation - Archy de 
Berker","isPartOf":{"@id":"https:\/\/deberker.com\/archy\/#website"},"primaryImageOfPage":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage"},"image":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","datePublished":"2017-03-07T10:17:30+00:00","dateModified":"2021-11-06T15:55:52+00:00","breadcrumb":{"@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#primaryimage","url":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","contentUrl":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","width":2560,"height":1922,"caption":"Photo by Aaron Burden on Unsplash"},{"@type":"BreadcrumbList","@id":"https:\/\/deberker.com\/archy\/building-a-burns-bot-1-markov-models-for-poetry-generation\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/deberker.com\/archy\/"},{"@type":"ListItem","position":2,"name":"Building a Burns bot #1: Markov models for poetry generation"}]},{"@type":"WebSite","@id":"https:\/\/deberker.com\/archy\/#website","url":"https:\/\/deberker.com\/archy\/","name":"Archy de Berker","description":"Building things with 
data","publisher":{"@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/deberker.com\/archy\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":["Person","Organization"],"@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/01cf8dd0f94a4ba124b26eeeeb59e67d","name":"archy","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/image\/","url":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2021\/09\/freelance-logo.png?fit=359%2C311&ssl=1","contentUrl":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2021\/09\/freelance-logo.png?fit=359%2C311&ssl=1","width":359,"height":311,"caption":"archy"},"logo":{"@id":"https:\/\/deberker.com\/archy\/#\/schema\/person\/image\/"},"sameAs":["https:\/\/x.com\/archydeb"],"url":"https:\/\/deberker.com\/archy\/author\/archy\/"}]}},"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/deberker.com\/archy\/wp-content\/uploads\/2017\/03\/aaron-burden-y02jEX_B0O0-unsplash-1-scaled.jpg?fit=2560%2C1922&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4cGwe-5V","jetpack-related-posts":[],"post_mailing_queue_ids":[],"_links":{"self":[{"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/posts\/367","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/comments?post=367"}],"version-history":[{"count":5,"href":"https:\/\/deberker.com
\/archy\/wp-json\/wp\/v2\/posts\/367\/revisions"}],"predecessor-version":[{"id":1148,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/posts\/367\/revisions\/1148"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/media\/1144"}],"wp:attachment":[{"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/media?parent=367"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/categories?post=367"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/deberker.com\/archy\/wp-json\/wp\/v2\/tags?post=367"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}