<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[matt.guide]]></title><description><![CDATA[Travel. Coffee. Data Science.]]></description><link>https://matt.guide/</link><image><url>https://matt.guide/favicon.png</url><title>matt.guide</title><link>https://matt.guide/</link></image><generator>Ghost 2.9</generator><lastBuildDate>Thu, 01 Apr 2021 20:48:36 GMT</lastBuildDate><atom:link href="https://matt.guide/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[COVID-19 Response App]]></title><description><![CDATA[COVID-19 is part of everyday life, and it is easy to feel like this is happening and we have no control. However, we do have our part to play as non-front liners. For starters, CDC COVID-19 [https://www.cdc.gov/coronavirus/2019-nCoV/index.html] website has many helpful resources including one of the current most powerful tools:self-isolation. Aside from that though, the federal response severely lacks in data and top-down guidance. To that end, Health innovation companies [https://www.fronti]]></description><link>https://matt.guide/covid-19/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba7</guid><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Mon, 13 Apr 2020 09:25:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1585282284319-e38fb6c29dd1?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1585282284319-e38fb6c29dd1?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="COVID-19 Response App"/><p>COVID-19 is part of everyday life, and it is easy to feel like this is happening and we have no control. However, we do have our part to play as non-front liners. For starters, <a href="https://www.cdc.gov/coronavirus/2019-nCoV/index.html">CDC COVID-19</a> website has many helpful resources including one of the current most powerful tools: <strong>self-isolation</strong>. Aside from that though, the federal response severely lacks in data and top-down guidance. To that end, <a href="https://www.frontiers.health/frontiers-health-ecosystem-companies-fighting-back-covid-19-outbreak-2/">Health innovation companies</a> are playing a key role in fighting the outbreak. This is where we can also make a difference.</p><p>I have written a couple posts about <a href="https://blog.asha.health/pivot-covid19/">Pivoting for the COVID-19 Response</a>. The primary work has been coordinating with practitioners to help them with pressing issues such a hospital capacity planning, resource management, and preparing for difficult conversations. Talking to subject matter experts (SMEs) in the field is where the rubber meets the road for creating the product.</p><h2 id="pivot-for-a-crisis">Pivot for a crisis</h2><p>We are well into the response and have seen some wonderful apps come out and communicated across the board. These new apps do require us to find our place in the innovation landscape, but we have been working with great resources to stay on top of it: </p><ul><li>Keep in touch with product owners. We have been working closely with the people who are at the forefront--the business owners--the doctors. 
</li><li>Utilize and contribute to open source projects. There are many great tools and capabilities being built by brilliant developers. We try to build on them to reduce redundant effort on both our sides. Collaborative work is easy with tools such as GitHub and Confluence, and collaborating is essential for such a fast-changing and technically complex problem.</li><li>Data</li></ul><h3 id="customizing-hospital-capacity-curve">Customizing hospital capacity curve</h3><p>I came across this highly customizable capacity curve built in <a href="https://d3js.org/">D3.js</a>: <a href="http://gabgoh.github.io/COVID/">Epidemic calculator</a> (and the <a href="https://github.com/gabgoh/epcalc">epicalc repo</a>). I think this is a great example of a lightweight application built in D3 using the <a href="https://svelte.dev/">Svelte framework</a>. There was some great research put into the app, which is referenced at the bottom of the page. </p><p>Here are a few ways I could see elevating this code to make it more useful:</p><ol><li>An additional drop-down box for state/county that pre-populates a few of the fields with population per region. This would allow us to feed in data and customize for regions. This can be pre-populated with data from the <a href="https://docs.google.com/spreadsheets/d/1XUVyZF3X_4m72ztFnXZFvDKn5Yys1aKgu2Zmefd7wVo/edit#gid=1576394115">Harvard hospital capacity gsheet</a>.</li><li>An adjustable horizontal line that represents "max hospital capacity". This would provide an obvious target to remain under for any given region.</li></ol><h3 id="questionnaire-app">Questionnaire App</h3><p>We are designing a <a href="https://blog.asha.health/covid-19-response-app/">COVID-19 Response Questionnaire</a> and have a write-up with more details.</p><h2 id="flexible-platforms">Flexible platforms</h2><p>Platforms that allow flexibility are amazing for a fast-evolving, scattered effort like this one. These platforms enable innovative solutions to be built and deployed at little or no cost and scaled to meet significant demand. We are building our Questionnaire App using the following components.</p><p><a href="https://firebase.google.com/"><strong>Firebase (GCP)</strong></a> - A suite of Google Cloud Platform tools that enable end-to-end app development and deployment. Feature development components include iOS, Android, and the web via Angular. We are especially interested in using Google Analytics and the ability to do cloud messaging.</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Annotation-2020-05-13-100148.png" class="kg-image" alt="COVID-19 Response App" loading="lazy"/></figure><p><strong><a href="https://www.jotform.com/">Jotform</a></strong> - Since we are building a questionnaire, we needed to rapidly prototype and do user testing. Jotform has a clean interface and useful features such as business rules and many integrations, which I will discuss next!</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Annotation-2020-05-13-100547.png" class="kg-image" alt="COVID-19 Response App" loading="lazy"/></figure><p><a href="https://www.twilio.com/">Twilio</a> - One critical aspect was the ability to reach out to patients to understand the outcomes of the decisions made. We wanted to keep this minimally invasive for patients and chose to use SMS messaging directly to their phone numbers.
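</p><p>For illustration only - our prototype drives this step through the no-code integrations described below rather than custom code - here is roughly what the same SMS step looks like with Twilio's Python helper library. The account SID, token, and phone numbers are placeholders:</p><pre><code># Illustrative sketch only; requires `pip install twilio`.
from twilio.rest import Client

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder Account SID
AUTH_TOKEN = "your_auth_token"                      # placeholder auth token

client = Client(ACCOUNT_SID, AUTH_TOKEN)

# Send a follow-up SMS after a patient submits the questionnaire
message = client.messages.create(
    body="Thank you for completing the COVID-19 questionnaire. How are you feeling today?",
    from_="+15005550006",  # Twilio test number; use your own provisioned number
    to="+15551234567",     # placeholder patient number
)
print(message.sid)
</code></pre><p>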
Twilio provides automated services for this and more. There is a free tier for demoing, but we did need to put down a nominal amount of money to get it running for our prototype and user testing.</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Annotation-2020-05-13-100528.png" class="kg-image" alt="COVID-19 Response App" loading="lazy"/></figure><p><a href="https://zapier.com/home">Zapier</a> - We needed to integrate many components to fit the rapidly changing needs of our users. Zapier provides a slick interface for creating "zaps" which are triggers from one app to another for a multitude of platforms and applications. We were able to quickly tie our prototype form in Jotform to customer contact platform, Twilio, to message patients. There is a free tier of 1000 "zaps", so we can do quite a bit of testing and initial deployment with it before we even need to enter payment information.</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Annotation-2020-05-13-100613.png" class="kg-image" alt="COVID-19 Response App" loading="lazy"/></figure>]]></content:encoded></item><item><title><![CDATA[Multi-Tasking Rig - Gaming, NAS, Docker, Cloud Server]]></title><description><![CDATA[2018-2019 rig pic, internalsMy rig has been updated with new guts since the first All-in-One Gaming, NAS, Docker with UnRaid [/all-in-one.html] post. I limit my updates to every few years so that I make at least a couple major generation jumps. And when I do this I upgrade 3 core components: CPU, RAM, and Motherboard. These 3 components are critically connected when the upgrade path is major versions away: * Motherboard - nervous system, controlling all components * RAM (memory) - amount ]]></description><link>https://matt.guide/gaming-rig-2018-2019/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba6</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Sun, 17 Feb 2019 17:23:27 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_6373-edit-feature_o.jpg" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_6373-edit_o.jpg" class="kg-image" alt="Multi-Tasking Rig - Gaming, NAS, Docker, Cloud Server" loading="lazy"><figcaption>2018-2019 rig pic, internals</figcaption></img></figure><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_6373-edit-feature_o.jpg" alt="Multi-Tasking Rig - Gaming, NAS, Docker, Cloud Server"/><p>My rig has been updated with new guts since the first <a href="/all-in-one.html">All-in-One Gaming, NAS, Docker with UnRaid</a> post. I limit my updates to every few years so that I make at least a couple major generation jumps. And when I do this I upgrade 3 core components: CPU, RAM, and Motherboard. 
These 3 components are critically connected when the upgrade path is major versions away:</p><ul><li><strong>Motherboard</strong> - nervous system, controlling all components</li><li><strong>RAM</strong> (memory) - amount of information able to be processed at one time</li><li><strong>CPU</strong> (processor) - speed at which information can be processed</li></ul><p>While these base components are tied together and make up the principal functionality of the computer, there are a few others that are important. However, I update these components as new technology emerges and markets change:</p><ul><li><strong>GPU</strong> (graphics card) - brains that enable high-powered gaming, video editing, and in some cases analytics processing</li><li><strong>Storage</strong> (solid state drives, SSD, NVMe) - long-term data storage and access</li><li><strong>Storage, long-term</strong> (Hard Disk Drive, HDD) - becoming less important as SSDs become larger and cheaper; this is cheaper, slower storage for non-immediate files and backups</li><li><strong>Case </strong>- container for all components. Originally considered mostly for looks, but this component can significantly impact cooling and performance!</li></ul><h2 id="reasons-for-the-update">Reasons for the update</h2><p>I needed an update for a few reasons:</p><ul><li>More CPU cores - Pin more dedicated CPU cores to the gaming instance and leave more for other instances</li><li>More RAM - Assign more RAM to gaming instance and have more left over for other applications</li><li>New technology - Faster Storage such as NVMe m.2 drives, built-in Wifi, faster RAM (DDR4)</li><li>Further expansions - Ability to add more storage and RAM for future needs</li></ul><h1 id="new-architecture-and-layout">New Architecture and Layout</h1><p>I laid out an update of the previous view. This is broken into the bare metal rig, UnRaid OS capabilities, and then downstream components and applications.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/unraid-gaming-v22.svg" class="kg-image" alt="Multi-Tasking Rig - Gaming, NAS, Docker, Cloud Server" loading="lazy"><figcaption>2018-2019 rig architecture</figcaption></img></figure><h1 id="what-did-we-go-with">What did we go with?</h1><p>A breakdown of the actual components.</p><h2 id="amd-ryzen-7-2700x-8-core-3-7-ghz-4-3-ghz-max-boost-">AMD RYZEN 7 2700X 8-Core, 3.7 GHz (4.3 GHz Max Boost) </h2><p>Intel still holds the top spot in game performance, due in large part to games being optimized for that architecture, but also due to better single-core performance. However, the pendulum is swinging back to AMD! I am jumping on the train with this model. </p><p>My reason for purchasing: <strong>More Cores!</strong> I wanted to focus on the relatively large number of cores this processor offers in a power-user, consumer format. I need to be able to power a variety of processes while specifically reserving some cores for my gaming VM. </p><h2 id="corsair-vengeance-lpx-32gb-2-x-16gb-288-pin-ddr4-sdram-ddr4-3200-pc4-25600-">CORSAIR Vengeance LPX 32GB (2 x 16GB) 288-Pin DDR4 SDRAM DDR4 3200 (PC4 25600) </h2><p>This is the high-speed gaming version of their RAM. I wanted the high performance for overclocking and dependable stability, because this thing will be running 24x7.</p><p>My reason for purchasing: <strong>Speed and Capacity</strong>. 
I upgraded from the previous generation DDR3, and the benefits of the additional speed are noticeable! Not to mention, upgrading to 32 GB offers the ability to share more of the RAM with other programs while dedicating a large chunk to the gaming VM.</p><h2 id="msi-x470-gaming-m7-ac-am4-amd-x470">MSI X470 GAMING M7 AC AM4 AMD X470 </h2><p>This motherboard offers overclocking capabilities and a host of cutting-edge features. The X470 was released specifically for the Ryzen 7, so it should have great stability and compatibility.</p><h2 id="corsair-hydro-series-h100i">CORSAIR HYDRO Series H100i</h2><p>A few months after assembling the computer, I decided to revisit the cooling. I was unhappy with the stock Ryzen cooler, which is good as stock coolers go but leaves quite a bit to be desired on performance. I did not think much of the AIO (all-in-one) water cooling option, but after digging into it I realized these do provide the best performance and cooling capability, save the Noctua coolers. However, the AIOs are quieter for the same cooling capacity. I could not be happier with this. Temps went from 70 C with the stock cooler to 49 C with the AIO (at medium pump and fan speed) under load.</p><h2 id="silverstone-technology-rl06">Silverstone Technology RL06</h2><p>Along with the new cooler, I got a new case to better fit the components and cooler. I cannot describe how impressed I have been with this case. I have never put much thought into the ability of a case to cool components, but this case is a high-throughput beast! It pushes air like no other! On top of that, there are magnetic dust filters on all the intakes, it is relatively compact while fitting all of my hard drives, and it nicely displays my new components.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_6371-edit_o.jpg" class="kg-image" alt="Multi-Tasking Rig - Gaming, NAS, Docker, Cloud Server" loading="lazy"><figcaption>2018-2019 rig pic, case</figcaption></img></figure>]]></content:encoded></item><item><title><![CDATA[In The Air - PDX to DEN]]></title><description><![CDATA[Some of the most beautiful landscape in the US is along the flight path from Portland, OR to Denver, CO. This trip in particular is gorg-mazing! (Yes, it's a word now from gorgeous and amazing.) During a clear morning after a fresh snowfall all the peaks are blanketed and exemplify the ruggedness of the land with craggy outcroppings in contrast to the powdery white. The sun in this early morning flight from 36k ft casts breathtaking shadows. Oh the things you will see Good morning Pass by th]]></description><link>https://matt.guide/in-the-air-pdx-to-den/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba5</guid><category><![CDATA[Travel]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Thu, 29 Nov 2018 11:25:40 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064727-1.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064727-1.jpg" alt="In The Air - PDX to DEN"/><p>Some of the most beautiful landscape in the US is along the flight path from Portland, OR to Denver, CO. This trip in particular is gorg-mazing! (Yes, it's a word now from gorgeous and amazing.) 
During a clear morning after a fresh snowfall all the peaks are blanketed and exemplify the ruggedness of the land with craggy outcroppings in contrast to the powdery white. The sun in this early morning flight from 36k ft casts breathtaking shadows.</p><!--kg-card-begin: markdown--><h1 id="ohthethingsyouwillsee">Oh the things you will see</h1> <h2 id="goodmorning">Good morning</h2> <p>We pass by the Columbia River Gorge in the darkness of pre-dawn; near the end of the gorge we are rewarded with a few peaks coming into view with the morning light.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_060914-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"><figcaption>Columbia River Gorge</figcaption></img></figure><!--kg-card-begin: markdown--><h2 id="wallawhat">Walla-What?</h2> <p>Wallowa-Whitman National Forest is the first place we can see some mountainous glory!</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_061819.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_061956.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="boise">Boise</h2> <p>Boise National Forest does not disappoint, but you need to get off the interstate to feel it on the ground.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_062012.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="sawtooth">Sawtooth!!</h2> <p>Sawtooth National Forest, near Boise, continues to impress and earn its name - look at those sharp peaks.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_062739.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_063019.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="thatsnomoon">That's no moon...</h2> <p>Craters of the Moon National Monument is an otherworldly site, especially from the air. 
Gossamer clouds and smoky haze obscure some of the view.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_063414-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="windpower">Wind power</h2> <p>We cross into a rolling Idaho landscape that is dotted as far as the eye can see with wind turbines along the ridges of high hills.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-width-wide"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064416.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064416-turbines.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"><figcaption>Zoom in on the wind turbines</figcaption></img></figure><!--kg-card-begin: markdown--><h2 id="grandtetons">Grand Tetons</h2> <p>We diverge slightly north to the Teton National Forest to see some amazing sights of the Grand Tetons; Jackson Hole is on the lower side. Yellowstone is up in the distance.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064727-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064846.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card kg-width-full"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_064951.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="wyoming">Wyoming</h2> <p>The Bridger and Fitzpatrick wilderness areas in the distance over the desolate, cold WY landscape.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_065718.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_070707.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="intocolorado">Into Colorado</h2> <p>We then took a southeastern course to get into the commanding Rockies proper, dipping up to our necks into Medicine Bow-Routt National Forest.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_072110.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="closertotherockies">Closer to the Rockies</h2> <p>On our way down into Denver we pass over the well-known Rocky Mountain NP and the Roosevelt NF beyond; what a sight from 25k ft.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_072836.jpg" class="kg-image" alt="In The Air - PDX to DEN" 
loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_073006-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_073151-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="finalapproach">Final approach</h2> <p>And, of course, on approach to Denver, looking East on the great, flat shelf of eastern Colorado that goes into Nebraska and the Midwest.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_073258.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_073511.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181112_073707.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h1 id="arewethereyet">Are we there yet?</h1> <p>How did I know where we were on the flight? I use an offline hiking map (<a href="https://play.google.com/store/apps/details?id=com.mictale.gpsessentials&hl=en_US">GPS Essentials app</a>) and the GPS signal, which takes a few seconds to zero in but tells the whole story! <em>Window seat required</em></p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Img_181112070844088-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/Img_181112071203740-1.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h1 id="returnflight">Return Flight</h1> <p>I got some more more amazing pics on the return flight that I also wanted to share! My biggest regret is that I had another North-facing window, where the South face would have given more another vantage, though the sun was a bit strong on that side.</p> <!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h2 id="backintothefold">Back into the fold</h2> <p>Going back into the mountains from Denver, I captured a great example of how the flat land turns into foothills and then further turns into the snow-covered mountains.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-width-full"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_152354.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="flaminggorgeisonfire">Flaming Gorge is on fire!</h2> <p>The evening sun (golden hour!) really showcases Flaming Gorge National Recreation Area, just north of Ashley National Forest. 
The gorge is just on the lower side of the photos.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_155354-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="utahfondlyremembered">Utah, fondly remembered</h2> <p>In the Uinta-Wasatch-Cache National Forest you can see maybe 200 miles north to the Tetons... Amazing mountains way in the distance.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_160324-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="bearlakeutahoristhatidaho">Bear Lake, Utah or is that Idaho?</h2> <p>Bear Lake straddles the Utah-Idaho border, with the Mount Naomi Wilderness area to the west. You can see how the lake was dammed up on the southern side, so straight. And the Tetons towering way in the back still.<br> The Great Salt Lake would be on the other side of the plane, to the South, and I would just love to capture that.</br></p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_160800-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="bamboomstraighttothemoon">Bam, boom, straight to the moon!</h2> <p>Pardon the misogynistic throwback reference to the Honeymooners. Craters of the Moon on the west-bound flight. A little less otherworldly in the late afternoon sun.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_162613-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="sawtooooooth">Sawtooooooth!!</h2> <p>Sawtooth and Boise National Forests again; we are a bit further south this time.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_163539.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="californiawildfiresfeltfarandwide">California wildfires felt far and wide</h2> <p>It's getting hazy and smoky as we pass further into Idaho. The California wildfires are showing their devastating force even up here. 
The Wallowa-Whitman National Forest and mountain tops are in the distance.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_165507.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="backtooregonhellomountadams">Back to Oregon, hello Mount Adams</h2> <p>By the time we come up on a parallel path to the Columbia River Gorge and begin our descent, the smoke to the south is intense; I can see it across the rows through an open window.</p> <p>Mount Adams is in clear, towering view, though!!</p> <p>Too bad Mt. St. Helens and Mt. Rainier were obscured by clouds and haze.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_171749-reo.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_172337.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_171749-reo-adams.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h2 id="welcomebacktoportland">Welcome back to Portland</h2> <p>And a moody, heavily fogged CRG near PDX! That's actually the Gifford Pinchot NF on the top side. Think it's raining down there?<br> Welcome to Portland!</br></p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_172725-reorient.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20181115_173618-reorient.jpg" class="kg-image" alt="In The Air - PDX to DEN" loading="lazy"/></figure><!--kg-card-begin: markdown--><h1 id="featurepic">Feature Pic</h1> <p>Taken by Matt Z on the East-bound trip over Idaho.</p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Create open source Time Lapse Videos from photos]]></title><description><![CDATA[Sitting on the top of Antelope Island for some sunset viewing on a July evening makes for some great shots and videos]]></description><link>https://matt.guide/how-to-create-open-source-time-lapse-videos/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba4</guid><category><![CDATA[Travel]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Mon, 13 Aug 2018 20:33:47 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_4743.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="chooseyourlocation">Choose your location</h1> <img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_4743.jpg" alt="Create open source Time Lapse Videos from photos"/><p>The true depth of a beautiful sunset can be hard to capture in a single photo. There is a majesty and depth that I feel as I relax on a mountain top after a grueling evening hike. 
Reflecting on the twilight landscape as the air cools and the fauna become more active brings excited shivers down my spine. To bring that feeling a little closer to home long after the day has finished, I will set up the DSLR camera and shoot a series of shots to capture the splendor of the moment.</p> <p>These pictures were taken from <strong>Antelope Island</strong> near Salt Lake City, UT on a gorgeous July evening.<br> <img src="https://res.cloudinary.com/mattguide/image/upload/v1572376586/blog-images/IMG_4766-1_Custom_cs8yue.jpg" alt="Create open source Time Lapse Videos from photos" loading="lazy"/></br></p> <p>The key to doing this is taking a photo every 2-10 seconds (I lean towards one photo every 3 seconds) for as long as it takes for the sun to go behind the horizon and the clouds to finally extinguish the burning reds and purples.</p> <h2 id="basicequipmentiused">Basic equipment I used</h2> <ul> <li><strong>Camera</strong> - ok, duh... But a solid DSLR camera that has been focused and set to Manual mode will make this amazing.</li> <li><strong>Tripod</strong> or camera stand - to set the camera in a stationary position for the entire sequence of photos</li> <li><strong>Bulb</strong> - this is an external camera trigger so I can leave the camera attached to the tripod and take photos every X seconds until finished</li> <li><strong>External Timer</strong> - this can be used in place of a bulb and offers a built-in timer that can be set to take a photo every X seconds for however long it is set</li> </ul> <p>Cameras differ in their capabilities to shoot these photos, some offer <code>Interval Timers</code> directly in the software that you can set the required parameters and completely bypass the need for a bulb or external timer.</p> <p>Bring the dog to take those night shots:<br> <img src="https://res.cloudinary.com/mattguide/image/upload/v1572376587/blog-images/18-07-04-20-50-25-9141_Custom_rnjspy.jpg" alt="Create open source Time Lapse Videos from photos" loading="lazy"/></br></p> <h1 id="ffmpegtoturntimelapsephotosintovideo">FFMPEG to turn Time Lapse Photos into Video</h1> <p>Use the <strong>ffmpeg</strong> program - <a href="http://ffmpeg.org">http://ffmpeg.org</a>, for options descriptions <a href="http://ffmpeg.org/ffmpeg.html">http://ffmpeg.org/ffmpeg.html</a></p> <h2 id="argumentsexplained">Arguments explained</h2> <p>Reference this site for detailed notes - <a href="http://lukemiller.org/index.php/2014/10/ffmpeg-time-lapse-notes/">http://lukemiller.org/index.php/2014/10/ffmpeg-time-lapse-notes/</a></p> <blockquote> <p>Let’s break down those options:</p> <ul> <li><code>-framerate 30</code> = This is the input frame rate. It effectively says that you want each input image to appear on screen for 1/30 of a second. If you wanted each input image to appear on screen for a whole second, you’d use a -r 1. The default is 25. If you select a slower input frame rate than the output frame rate, ffmpeg will simply duplicate the input image appropriately so that it stays on screen for the desired amount of time.</li> <li>-i image%04d.jpg = ffmpeg will look for input images that match this filename pattern. As with the discussion above on filename patterns, the %04d tells ffmpeg to look for a 4-digit number in the filename. If you used %05d to generate your captured images, you’d want to use %05d here as well.</li> <li><code>-c:v libx264</code> = This specifies the video encoding to be used. 
libx264 is a good default choice for high quality and low file size, and should be readable by all sorts of video players. It may or may not work in some versions of PowerPoint on some operating systems, so check that before you get too excited about adding video into your presentations. It is possible to output as a wmv file or other formats, so search the web if you need a different format.</li> <li><code>-r 30</code> = specifies the framerate of the final movie. 30 frames per second is fairly standard, 24 fps is also common, and you could do something like 48 or 60 if you have special needs.</li> <li><code>outputfile.mp4</code> = the filename and format. As before, you can specify a different directory path to put the file into, otherwise it will appear in the current working directory. The .mp4 format should work on most web and mobile devices, and uploads to Youtube just fine.</li> </ul> </blockquote> <h2 id="myadditionalargument">My additional argument</h2> <p>Note, I also added an argument of my own to fix support for Win10 built-in players:</p> <ul> <li><code>-pix_fmt yuv420p</code> - Found this solution on StackOverflow. my libx264 videos were not playing natively in Windows using the Win10 "Movies and TV" app or even the "Windows Media Player". Subsequently, they would not play on OneDrive either. FFMPEG has a hilarious description for this argument: <ul> <li> <blockquote> <p><strong>Encoding for dumb players</strong>: You may need to use -vf format=yuv420p (or the alias -pix_fmt yuv420p) for your output to work in QuickTime and most other players. These players only support the YUV planar color space with 4:2:0 chroma subsampling for H.264 video. Otherwise, depending on your source, ffmpeg may output to a pixel format that may be incompatible with these players.</p> </blockquote> </li> </ul> </li> <li><code>-preset veryslow</code> - This tells the libx264 algorithm to create a more highly compressed video at the expense of taking a much longer time to encode. For me, the time to encode went up by about 3 times (~20 secs to 60 secs encoding time), but also did reduce the file size by about 30% (from 3.8 to 2.4 MB video file size). If you're not in a hurry, set it to <code>veryslow</code> or <code>slow</code> to save on space without losing fidelity.</li> </ul> <h1 id="mycode">My code</h1> <p>This will turn 195 photos into a 20 frame per second (FPS) HD (1080p) video from images in the <code>tlv</code> directory under the current folder (relative to the CMD prompt) from image starting with number IMG_4773.</p> <pre><code>ffmpeg -framerate 20 -r 20 -start_number 4773 -i tlv\IMG_%04d.JPG -s:v 1920x1080 -c:v libx264 -preset veryslow -pix_fmt yuv420p OUTPUT20fps_vslow_pixfmt.mp4 </code></pre> <p>Play around with the framerates to get a feel for how it changes your video.</p> <h1 id="resultsandthoughts">Results and Thoughts</h1> <p>30 FPS is good, but a bit fast. :-)</p> <p>Check out the video I created off of my photos:</p> <video width="100%" controls=""> <source src="https://onedrive.live.com/download?cid=533B2C0B88C57E81&resid=533B2C0B88C57E81%2161096&authkey=AMINA8h7N9oGz6Q" type="video/mp4"> Your browser does not support the video tag. Download video via this link - https://1drv.ms/v/s!AIF-xYgLLDtTg90o </source></video> <p>Perhaps not the most dramatic sunset video, but I am still working on the capture skills. 
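</p> <p>If you want to compare a few framerates side by side without retyping the command, a small Python wrapper around the same ffmpeg call (using the flags explained above) can save time. This is just a convenience sketch, assuming ffmpeg is on your PATH and the photos are in the <code>tlv</code> folder as described:</p> <pre><code># Convenience sketch only -- batches the same ffmpeg command at a few framerates.
import subprocess

for fps in (20, 24, 30):
    cmd = [
        "ffmpeg",
        "-framerate", str(fps), "-r", str(fps),
        "-start_number", "4773",
        "-i", r"tlv\IMG_%04d.JPG",
        "-s:v", "1920x1080",
        "-c:v", "libx264", "-preset", "slow", "-pix_fmt", "yuv420p",
        f"OUTPUT{fps}fps.mp4",
    ]
    subprocess.run(cmd, check=True)  # raises if ffmpeg exits with an error
</code></pre> <p>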
I hope this helps you create your dramatic TLV capture!</p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Public Use Healthcare Claims]]></title><description><![CDATA[It's hard to find healthcare claims data that looks like real payer financial claims. They are available via the CMS Synthetic Public Use File! Read on to find out more about these exceptionally flexible datasets.]]></description><link>https://matt.guide/public-use-healthcare-claims/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba3</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Fri, 06 Jul 2018 12:10:09 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1504868584819-f8e8b4b6d7e3?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://images.unsplash.com/photo-1504868584819-f8e8b4b6d7e3?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Public Use Healthcare Claims"/><p>Where do you go if you want high-quality healthcare payer claims data? Well, <strong>SynPUF!</strong> of course.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/cms_synpuf.png" class="kg-image" alt="Public Use Healthcare Claims" loading="lazy"/></figure><!--kg-card-begin: markdown--><h1 id="whatispublicusehealthcareclaims">What are public use healthcare claims?</h1> <p><a href="https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/SynPUFs/DE_Syn_PUF.html">CMS 2008-2010 Data Entrepreneurs' Synthetic Public Use File</a> (SynPUF) provides a realistic set of Medicare claims data that is available in the public domain. CMS.gov lists a few purposes for this data:</p> <ol> <li>software and ETL pipeline development</li> <li>research and analytics practitioner training in the complexity of claims data</li> <li>support for data mining and advanced analytics activities.</li> </ol> <h1 id="whyusethisdata">Why use this data?</h1> <p>We are currently using this data to <strong>build solutions, pipelines, data models, and real-world analytics frameworks</strong>. This data looks so realistic that we can actually work with it, massage it, and work through issues that we would encounter with typical claims data inside a payer environment.</p> <p>In addition, I am using this data to <strong>generate synthetic Electronic Medical/Health Records (EMR/EHR)</strong> in an XML format called HL7 (FHIR) via Python code. I will do a later write-up on this particular code process.</p> <h1 id="largecollectionofrealisticlookingdata">Large collection of realistic-looking data</h1> <p>There is a fairly "large" amount of claims data available in this set. It will definitely put your processes to the test, but won't overwhelm.<br> Some highlights:</br></p> <ul> <li>~2.3 million unique <strong>beneficiary (member)</strong> entries in each year from 2008 to 2010</li> <li>~17 million combined <strong>inpatient and outpatient</strong> (IP and OP, respectively) medical claims for 2008-2010</li> <li>~111 million <strong>prescription drug events</strong> (PDE / Rx / pharmacy)</li> <li>~95 million <strong>carrier claims</strong></li> </ul> <p>The data is available in very manageable pieces. In fact, each of these subject areas is available in its own file. 
On top of that, each is broken out into 20 samples that correspond to each other. In that way, you can pull Sample 01 from each of the beneficiary, IP, OP, etc. files and get all the corresponding claims for those members.</p> <h1 id="gooddocumentation">Good documentation</h1> <p>CMS has provided very concise data definitions and dictionaries. I mean, it's good in general, but <em>very</em> good for CMS.gov.</p> <ul> <li><a href="https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/SynPUFs/Downloads/SynPUF_DUG.pdf">Data Users Document</a> - This will give you the overview and counts for each of the tables, as well as basic analytical and use descriptions.</li> <li><a href="https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/SynPUFs/Downloads/SynPUF_Codebook.pdf">Codebook</a> - Simple, extensive descriptions, valid values, counts, etc. for each field in the tables. You will also find external references for some values.</li> </ul> <h1 id="caveats">Caveats</h1> <p>The data is not perfect, and you can read more about interpretation limitations in the Users Doc; many of these limitations revolve around the fact that the data is completely deidentified.</p> <p>However, there are some specific considerations I think are worth pointing out that make this data limited:</p> <ul> <li>Data from 2008-2010 is a bit <strong>outdated</strong>. The healthcare landscape has changed since this time. Not least of which is the implementation of the Affordable Care Act (ACA), which had some wide-ranging effects on behavior and reporting.</li> <li>Medical claims (IP/OP) use deprecated <strong>ICD-9-CM</strong> diagnosis codes. Since Oct 2015 ICD-10 has been required on medical claims data, so you will need to convert these. For that, check out <a href="https://www.cms.gov/Medicare/Coding/ICD10/2018-ICD-10-CM-and-GEMs.html">CMS's General Equivalence Mappings (GEMs)</a>.</li> <li>Analytics should be taken with a grain of salt. This is discussed in detail in the document, but because this is a public use file, the deidentified nature of the data means not all correlations will be represented the same way in real-world data.</li> </ul> <h1 id="otherresources">Other Resources</h1> <p>Here are some other resources that pair well with the SynPUF data for analytics and research purposes:</p> <ul> <li><a href="https://www.cms.gov/Medicare/Health-Plans/MedicareAdvtgSpecRateStats/Risk-Adjustors-Items/Risk2018.html">CMS ICD-10 to HCC 2018 crosswalks</a> - Used primarily for Risk Adjustment (RA). These codes will allow you to roll up to a high-level HCC from the diagnosis code and have associated Risk Adjustment Factor values.</li> <li><a href="https://www.nber.org/data/ssa-fips-state-county-crosswalk.html">SSA to FIPS state and county crosswalk</a> - The SynPUF beneficiary file uses SSA county codes, which are not widely used for geo-spatial analysis. FIPS is more widely cross-compatible and will allow you to use it in more software packages.</li> </ul> <h1 id="justforfuntrythisusecase">Just for fun - try this use case</h1> <p>Ok, so I would also like to put out a challenge for someone who wants to get into Healthcare data. I think a good Data Analyst or Data Scientist should be able to work from end-to-end. 
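</p> <p>To make that concrete, here is a rough first-pass sketch in Python with pandas. The file and column names below are illustrative (written from memory of the codebook), so verify them against the actual SynPUF downloads before running:</p> <pre><code># Rough first pass at the SynPUF Sample 01 files with pandas.
# File and column names are illustrative -- check the SynPUF codebook.
import pandas as pd

bene = pd.read_csv("DE1_0_2008_Beneficiary_Summary_File_Sample_1.csv")
ip = pd.read_csv("DE1_0_2008_to_2010_Inpatient_Claims_Sample_1.csv")

# Join inpatient claims to beneficiaries on the synthetic member ID
claims = ip.merge(bene, on="DESYNPUF_ID", how="inner")

# Question-1 style cut: which primary diagnoses drive the most inpatient spend?
top_dx = (claims.groupby("ICD9_DGNS_CD_1")["CLM_PMT_AMT"]
                .sum()
                .sort_values(ascending=False)
                .head(20))
print(top_dx)
</code></pre> <p>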
So, download the datasets, load them, map them, and then try to answer some business problems:</p> <p><strong>Business Problems</strong>: We are a national health insurance payer.</p> <ol> <li>What diseases have the biggest impact on cost?</li> <li>How much should we expect to pay next year for a Female, 70-74 year old member with Heart Failure and Osteoporosis?</li> <li>What diseases were most common among members who died?</li> </ol> <p>Try it out with any stack you like. I would recommend the <a href="/docker-container-for-dedupe.html">matt.guide docker container</a> with Python and MariaDB (MySQL).</p> <h1 id="featurephoto">Feature Photo</h1> <p>Feature photo is from <a href="https://unsplash.com/@goumbik?utm_medium=referral&utm_campaign=photographer-credit&utm_content=creditBadge" target="_blank" rel="noopener noreferrer">Unsplash.com by Lukas Blazek</a></p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Home Coffee Brew - Hanza, Intelligentsia, PT's Kilimanjaro]]></title><description><![CDATA[Home Coffee Brew Series - Jan-Feb 2016]]></description><link>https://matt.guide/hanza-intelligensia-pts-kilimanjaro/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba2</guid><category><![CDATA[Coffee]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Wed, 06 Jun 2018 20:04:24 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_1039--3-.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/IMG_1039--3-.jpg" alt="Home Coffee Brew - Hanza, Intelligentsia, PT's Kilimanjaro"/><p>This is the first in a series I will be doing on home coffee brews. We travel around the US and look for a taste of some of the best independent, local roasters! The ones we really like at the shop, we purchase and bring home to brew. I keep a journal with the stats and some of my flavor and bouquet notes.</p> <h1 id="brewsetup">Brew Setup</h1> <p>My home brewing setup consists of a Kalita Wave with a Kalita carafe using a hand-crank burr grinder/mill. All of the brews are made at a roughly 15:1 ratio (33g beans with 500 mL water) with a medium-coarse grind and took about 3 minutes to brew, unless otherwise stated.</p> <h1 id="hanzaethiopiacelinga">Hanza, Ethiopia - Celinga</h1> <p><a href="https://www.hansacoffee.com/">Hanza</a> is a roaster based in Libertyville, IL (check out my Chicagoland post for more info and pics). It was a delight to go here for a lunch pour-over on those cold Chicago winters. 
Brought this one home from work one week.</p> <p>Hanza, Ethiopia - Yirgacheffe; Gedeo Zone</p> <p><strong>Roasted</strong> 12/16/2016<br> <strong>Ratio</strong> 1:20 at 600 mL, 30 g grounds<br> <strong>Grind</strong> Medium-coarse<br> <strong>Process</strong> Natural<br> <strong>Tasting notes</strong> Sweet Berry, Jasmine, Lemon Curd</br></br></br></br></p> <p><strong>My take</strong> Light color, full flavor with a hint of the roast (not really sure what I was going for there!)</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20161225_101042--2---Small-.jpg" class="kg-image" alt="Home Coffee Brew - Hanza, Intelligentsia, PT's Kilimanjaro" loading="lazy"/></figure><h1 id="intelligentsia-columbia-santuario-red-bourbon-limited">Intelligentsia, Colombia - Santuario Red Bourbon, Limited</h1><p><a href="https://www.intelligentsiacoffee.com/">Intelligentsia</a> distributes around the US, but they have Chicago listed as one of their home bases, so I wanted to give them a try since I had a good brew of theirs at a small coffee shop, Glenco Roasters (which seems like false advertising since they do no roasting).</p><p>Intelligentsia, Santuario Red Bourbon, Colombia Limited Release; Direct Trade</p><p><strong>Roasted</strong> 1/18/2016<br><strong>Ratio</strong> 1:15 @ 33.5g w 500mL<br><strong>Grind</strong> Medium-coarse<br><strong>Tasting notes</strong> Cherry and Peach Cobbler</br></br></br></p><p><strong>My take</strong> Beans were pretty fresh, they bloomed like a muffin, roughly doubling in size to look like a mushroom cloud!</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20170128_095029--2---Small-.jpg" class="kg-image" alt="Home Coffee Brew - Hanza, Intelligentsia, PT's Kilimanjaro" loading="lazy"/></figure><h1 id="pt-s-el-salvador-kilimanjaro">PT's, El Salvador - Kilimanjaro</h1><p>PT's Coffee is a staple in our house. Ever since I first visited their shop in Topeka, KS, I have been hooked. There is no one I have had that does it better. 
Some match them, but PT's practices (Direct Trade, working with farmers, etc.), excellent beans, and superior roast keep them on top for pour over coffee!</p><p>PT's Coffee, El Salvador - Kilimanjaro - Natural; Aida Batlle; Direct Trade</p><p><strong>Roasted</strong> Unknown<br><strong>Elevation</strong> 5600 ft<br><strong>Varietal</strong> Kenya & Bourbon<br><strong>Tasting Notes</strong> Chocolate, Grapefruit, Bluberry Jam</br></br></br></p><p><strong>My Take</strong> No notes on this one...</p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20170119_100638--2---Small-.jpg" class="kg-image" alt="Home Coffee Brew - Hanza, Intelligentsia, PT's Kilimanjaro" loading="lazy"/></figure><h1 id="featured-image">Featured Image</h1><p>Roasted coffee bean dispensers from a small street-side roastery we visited in Athens Greece by Matt Zajack.</p>]]></content:encoded></item><item><title><![CDATA[Docker container for Python dedupe]]></title><description><![CDATA[The Entity Resolution series begins with a Docker container to get you Python, dedupe, and Libpostal all in one fell swoop!]]></description><link>https://matt.guide/docker-container-for-dedupe/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151ba0</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Wed, 23 May 2018 10:36:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1493946740644-2d8a1f1a6aff?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=2a42149bdadb9808b62d87f71f08dab1" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://images.unsplash.com/photo-1493946740644-2d8a1f1a6aff?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=2a42149bdadb9808b62d87f71f08dab1" alt="Docker container for Python dedupe"/><p>This will be the first post in a series I will be doing on an Entity Resolution solution I am putting together for Clarity Insights.</p> <p>I will give a full write-up of Entity Resolution in another post. For now, I am going to describe the platform so we can easily get to the <a href="https://dedupeio.github.io/dedupe-examples/docs/mysql_example.html">MySQL dedupe example</a>!</p> <p>This container is built around the <strong><a href="https://pypi.org/project/dedupe/">Dedupe Python</a></strong> library, built as an open-source API available at <a href="https://dedupe.io">Dedupe.io</a> by <a href="https://datamade.us">DataMade</a>.</p> <h1 id="requirementscuttingedgemachinelearningtools">Requirements - Cutting-edge Machine Learning Tools</h1> <p>The container has been built with capabilities beyond those minimum necessary for Dedupe to run. This is to allow a complete package that may be expanded.</p> <ul> <li><strong>Linux</strong> base installation (minimal Debian for Docker)</li> <li><strong>Python</strong> - <a href="https://www.anaconda.com">Anaconda</a> for Python 3 was used as the core install. This package offers a wide variety of Data Science-related packages, libraries, and tools.</li> <li><strong>Dedupe</strong> - Core application for deduplication. Read the <a href="https://docs.dedupe.io/en/latest/">dedupe documentation</a> for detailed information. This is a simple install with <code>pip install dedupe</code></li> <li><strong>Libpostal</strong> - Address parser application. 
This is an additional feature that makes a perfect pairing with Dedupe by splitting single address fields into component parts. Open source, free, and accurate! Read the <a href="https://mapzen.com/blog/inside-libpostal/">Libpostal write-up</a> and view the <a href="https://github.com/openvenues/pypostal">Libpostal Python Github</a>. This is <em>not</em> a Python package, but a stand-alone C-based application with Python bindings.</li> <li><strong>Jupyter</strong> allows you to execute code through a web interface. This is enabled by default, installed through the Conda package manager.</li> <li><strong>MySQL connectors</strong> - We utilized PyMySQL for the connection, which is a drop-in replacement for the outdated MySQLdb. <code>pip install PyMySQL</code>.</li> </ul> <h1 id="howtousethiscontainer">How to use this container</h1> <h2 id="installdocker">Install Docker</h2> <p>If you haven't done so already, you will need to install <a href="https://www.docker.com">Docker</a>.</p> <p>Grab the container from (<a href="https://hub.docker.com/r/mattguide/python-dedupe/">https://hub.docker.com/r/mattguide/python-dedupe/</a>) with<br> <code>docker pull mattguide/python-dedupe</code> .</br></p> <p>This is a large download (over 3.2GB uncompressed on my system), so grab a coffee while you wait!</p> <h2 id="runthebarebonescontainer">Run the bare-bones container</h2> <p>Once you have the download, it is a quick path to running it. If you just <code>docker run</code> this package, you will find it immediately exits without any message. To avoid this, you will want to run it in interactive and tty mode (-i and -t parameters) so you can log in or just keep it alive:<br> <code>docker run -i -t --name="anaconda" mattguide/python-dedupe</code></br></p> <h2 id="runwithajupyterwebfrontend">Run with a Jupyter web front-end</h2> <p>The bigger bang for your buck will be to start Jupyter immediately so you can access it via a web interface. This will make it easy to hit the ground running:<br> <code>docker run -d --name="anaconda" --net="bridge" -p 8888:8888/tcp -v "/myvolume/":"/opt/notebooks/":rw -i -t mattguide/python-dedupe /bin/bash -c "/opt/conda/bin/jupyter notebook --notebook-dir=/opt/notebooks --ip='*' --port=8888 --no-browser --allow-root"</code></br></p> <h2 id="argumentsexplained">Arguments explained</h2> <p>There are a lot of arguments there; read about them in the <a href="https://docs.docker.com/engine/reference/run/#general-form">Docker run docs</a>, but let's break them down:</p> <ul> <li><strong>-d</strong> - to run in detached mode</li> <li><strong>--name=</strong> - to give a friendly name to the container for easier reference</li> <li><strong>--net=</strong> - to tell the type of network to use; it will share the same IP as the source Docker install (probably your machine)</li> <li><strong>-p</strong> - port to bind relative to container, external:internal</li> <li><strong>-v</strong> - volume, use this to save internal container files somewhere permanently, externally on your system (this is important: if you terminate your Docker instance, any data not in this location is destroyed!)</li> <li><strong>-i -t</strong> - run in interactive mode (stdin) and tty so you can open a shell in it with a <code>docker exec</code> command</li> <li><strong>/bin/bash -c</strong> - this is a command internal to the container. It tells the container to run the bash command in quotes (starts Jupyter). Read the <a href="https://jupyter.readthedocs.io/en/latest/">Jupyter docs</a> for the commands relevant there.</li> </ul>
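<p>Once the container is up (through the Jupyter interface or a <code>docker exec</code> shell), a quick smoke test confirms the main pieces are importable and wired together. This is only an illustrative sketch with made-up data and hypothetical field names, and the dedupe field syntax has shifted between releases, so check the docs pinned above:</p> <pre><code># Smoke test inside the container -- illustrative only, data and fields are made up.
from postal.parser import parse_address  # Libpostal Python bindings (pypostal)
import dedupe

# Libpostal: split a single address string into labeled components
print(parse_address("123 Main St Apt 4, Springfield, IL 62704"))

# Dedupe: declare the fields you intend to deduplicate on
fields = [
    {"field": "name", "type": "String"},
    {"field": "address", "type": "String"},
]
deduper = dedupe.Dedupe(fields)
print(type(deduper))  # if this runs, the core libraries are installed correctly
</code></pre>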
<h1 id="additionalrequirementmariadbmysqlcontainer">Additional requirement - MariaDB (MySQL) container</h1> <p>Are we there yet?? The python-dedupe container will give you everything you need to run dedupe. However, to run the large-ish MySQL dedupe example you will need a MySQL instance. For this, I use MariaDB since it is the open source, community-driven drop-in replacement for MySQL.</p> <p>Side note: Did you know MySQL was acquired by Oracle and is no longer a community-driven project?? I did not until a year ago! MariaDB is a fork maintained largely by the original MySQL developers to keep it free and open.</p> <h2 id="runthemariadbcontainer">Run the MariaDB container</h2> <p>Go to the <a href="https://store.docker.com/images/mariadb">Docker Store MariaDB</a> page for detailed information.<br> <code>docker pull mariadb</code></br></p> <p>Now, you will need only a few arguments to get it up and running. Similar to the dedupe container, you will want to map a volume so you can save your database if it is ever terminated:<br> <code>docker run --name="mariadb-dedupe" -v /my/custom:/etc/mysql/conf.d -e MYSQL_ROOT_PASSWORD=my-secret-pw -d -i -t mariadb</code></br></p> <p>You will be able to connect to the MariaDB instance through the standard MySQL port, <code>3306</code>.</p> <p>Bonus: In order to work with MySQL, you can open a shell directly in the instance (<code>docker exec</code> again). Alternatively, you can install ANOTHER CONTAINER! This one works natively with MySQL (it used to be called phpMinAdmin, do you remember that??):<br> <code>docker pull adminer</code><br> <code>docker run --name="adminer" -p 8080:8080 adminer</code><br> Once it completely initializes, you will be able to access the MySQL admin screen from <code>http://host-ip:8080</code>.</br></br></br></p> <h1 id="whatsnext">What's next??</h1> <p>That should give you plenty to chew on for a while. If you want to get down to it, you should check out the <a href="https://dedupeio.github.io/dedupe-examples/docs/mysql_example.html"><strong>MySQL dedupe example</strong></a>. The dedupe developers, DataMade, have provided demo data and code for you to use and enjoy!<br> Note: I had to modify several parts of both the init and example program code. For example, MariaDB uses strict settings, so some non-standard formats in the data will cause it to error out. I will leave that for you to investigate, but I will upload my code at some point here.</br></p> <p>Enjoy!</p> <h1 id="featuredimagecredits">Featured image credits</h1> <p>Featured image is from <a href="https://unsplash.com/@frankiefoto?utm_source=ghost&utm_medium=referral&utm_campaign=api-credit">Unsplash.com by frank mckenna</a></p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Chicagoland Consulting]]></title><description><![CDATA[I grew up in Wisconsin, so my year consulting in Chicagoland (Northbrook, Glenview, Lake Forest) was some familiar landscape: flat, green, tree-covered, and swampy in the summer and a brown and white frozen tundra in the winter. However, there is a lot of urban sprawl up this direction. It's no surprise companies put large headquarter buildings out here--there are some big, glassy, gorgeous grounds running up 294.
Of course, all of this suburban sprawl and pocketed commercialism can make for a r]]></description><link>https://matt.guide/chicagoland-consulting/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151b9f</guid><category><![CDATA[Travel]]></category><category><![CDATA[Coffee]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Sun, 13 May 2018 19:18:39 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20170216_175454--Large-.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20170216_175454--Large-.jpg" alt="Chicagoland Consulting"/><p>I grew up in Wisconsin, so my year consulting in Chicagoland (Northbrook, Glenview, Lake Forest) was some familiar landscape: flat, green, tree-covered, and swampy in the summer and a brown and white frozen tundra in the winter. However, there is <em>a lot</em> of urban sprawl up this direction. It's no surprise companies put large headquarter buildings out here--there are some big, glassy, gorgeous grounds running up 294.<br> Of course, all of this suburban sprawl and pocketed commercialism can make for a rather... bland and homogenous environment and foodscape at first glance. I am going to highlight some of the more interesting spots where I found a lot of joy during my workweek in the area.</br></p> <h1 id="favorites">Favorites</h1> <ul> <li>Coffee - <a href="https://www.hansacoffee.com">Hansa Coffee</a> in Libertyville, IL - roasted in house, thoughtfully sourced, and delicious pour over</li> <li>Lunch - <a href="http://www.picnicbasketfood.com">The Picnic Basket</a> in Libertyville, IL - great, fast made-to-order sandwhiches and sides</li> <li>Dinner - <a href="http://sgddubu.com/locations.php?id=5">So Gong Dong</a> in Glenview, IL - Korean comfort food BBQ, great hot stone plates and so many kimchi sides</li> <li>Run - <a href="http://fpdcc.com/preserves-and-trails/trail-descriptions/#des-plaines">Des Plaines Trail in the Forest Preserve of Cook County</a> - Over 20 miles of gravel/dirt trails in wooded areas, a great respite from car-centric suburbia</li> <li>Hotel - Delta Suites North Shore (Marriott) - modern, spacious rooms</li> </ul> <h1 id="coffeeandfood">Coffee and Food</h1> <h2 id="hansacoffeeandthepicnicbasket">Hansa Coffee and the Picnic Basket</h2> <p>When I would drive up from a late morning flight into ORD to Lake Forest, I would make a bee-line to Libertyville to grab a delicious pour-over at <strong><a href="https://www.hansacoffee.com">Hansa Coffee</a></strong> and walk over to <strong><a href="http://www.picnicbasketfood.com">The Picnic Basket</a></strong> for a sandwhich and salad.</p> <p><strong>Hansa</strong> roasts beans in what looks like an old auto service shop, just on the other side of the tracks in Libertyville. You know it's the right place when you see the giant COFFEE painted on the side of the building. They have a wide variety of roasts that are thoughtfully sourced from around the world. 
Especially on those cold, cold January mornings, a pour over here goes a long way.</p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20161220_095712--Small-.jpg" class="kg-image" alt="Chicagoland Consulting" loading="lazy"><figcaption>Hansa Roastery</figcaption></img></figure><p><strong>The Picnic Basket</strong> is an unassuming place that makes sandwhiches to order and has a line out the door during lunchtime most days, but they serve fast. My favorite is common to the Chicago and Milwaukee area - the Muffuletta! It's just a large roll with a couple types of deli meat and spicy pickled olives and veggies. I take it with a side of kale salad and enjoy it with the Hansa I got down the street.<br/></p><figure class="kg-card kg-image-card"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20171214_123929--Small-.jpg" class="kg-image" alt="Chicagoland Consulting" loading="lazy"/></figure><h2 id="honorable-mention-coffees">Honorable mention coffees</h2><ul><li><a href="http://chaube.coffee">Chaube Coffee</a> - A young Polish woman started the store next to the train station (chaube means coffee in Polski!). They roast their own beans in-house, but the coffee itself is pretty standard workday coffee on the pour over but to be fair the owner prefers espresso. My biggest complaint is that they changed their hours so they are not open during the lunchtime!</li><li><a href="https://www.newport-coffee.com">Newport Coffee House</a> - Lovely spot to read a newspaper and post on a blog. The place smells heavy of roast coffee when they are at work, but I find it enjoyable. The pour-over roasts are not so good, but when I need a pick-me-up, it's better than Starbucks without breaking a sweat.</li><li><a href="https://www.intelligentsiacoffee.com">Intelligensia Coffee</a> - Definitely not a micro-roaster, but having a headquarters in Chicago, this is a solid cup of coffee.</li></ul><h2 id="other-great-food-places">Other great food places</h2><ul><li><a href="http://www.milwalkytaco.com">Milwalky Taco</a> - Amazing food and tequila, I have way overdone it a few times here on both counts. Everything is done right with a little extra plus. I am not a big dessert guy, but save room for the horchata milkshake, everything from scratch in-house!</li><li><a href="http://www.house406restaurant.com">House 406</a> - Modern, upscale American food. Great place to have a working dinner or to relax at the bar. The food is clean and delicious. The drink selection is well-curated and you can't go wrong.</li><li><a href="http://plateiachicago.com">Plateia</a> - Upscale Mediterranean kitchen. Greek through-and-through with delicious small plates and entrees. I can't get enough of starter feta, olive oil, olives and crust house-made bread.</li><li><a href="http://www.cherrypitcafe.net">Cherry Pit Cafe</a> - Best breakfast for lunch anywhere! Has 50's style counter in front complete with low red stools. 
Everything is from scratch, simple, and very well done.<br/></li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20170214_123025--Small-.jpg" class="kg-image" alt="Chicagoland Consulting" loading="lazy"><figcaption>Cherry Pit Cafe skillet</figcaption></img></figure><h3 id="steak-places">Steak places</h3><p>Chicago, being close to the north woods has a strong history with supper clubs--check this <a href="https://www.splendidtable.org/story/the-history-and-renaissance-of-the-wisconsin-supper-club">Splendid Table article on Wisconsin supper clubs</a>. Along the same lines (though not exactly the same) are the plethra of Steak places in Chicago. You can't throw a rock without hitting one. From upscale chain places like J. Alexanders and Flemings to local haunts like Carsons (ribs.com??).</p><h1 id="outdoor-recreation">Outdoor Recreation</h1><p>After a long day consulting the office, it is so necessary to get out and stretch the legs. I am a trail running nut, so I was really trying to find a way to get off the pavement and have a good run.</p><ul><li><a href="http://fpdcc.com/preserves-and-trails/trail-descriptions/#des-plaines"><strong>Des Plaines Trail in the Forest Preserve of Cook County</strong></a> provided miles and miles (20+ miles!) of mostly gravel with a mix of dirt that runs mostly north and south, parallel to I-294. It is a wooded oasis among the suburban sprawl. There are many areas that have recently been clear-cut (hopefully responsibly), but then many jaunts through dense woods, along the Des Plaines River and dams and the best part is it's miles of unbroken trail, even a few bridges over the major roads like Willow, so you don't have to miss a step. It is completely flat and should make for some fast times.</li><li><a href="http://www.glenviewparks.org/thegrove/"><strong>The Grove</strong></a>, part of Glenview Parks, is an interesting run. It's intended for more a suburban quasi-wooded hike and historical center. However, it is literally across the street from the Delta hotel, so I made a quick run out there. It is a beautiful natural area with dense tree cover for a few hundred acres and many swamps. The deer herd is very thick through here!</li></ul><h1 id="hotels">Hotels</h1><p>I stayed in every Marriott-associated hotel in the area--there was a 16 week stretch I stayed a different one every week! I stayed everywhere from the Renaissance Schaumburg to the North Shore and everything in between. Near the end of my tenure I found myself splitting time between the Marriott Suites right off 294 and more often at the <strong>Delta Suites North Shore</strong> (again right off 294 near Milwaukee Ave). The latter offered expansive rooms that were recently renovated (it used to be an Embassy Suites, I believe!).</p><p>Two big reasons for choosing the Delta:</p><ol><li>Running - It was very near the <a href="http://fpdcc.com/preserves-and-trails/trail-descriptions/#des-plaines">Des Plaines Trail in the Forest Preserve of Cook County</a> (more below)</li><li>Food - Definitely not a foodie mecca, but it did have a delicious concentration of Korean establishments. My favorite, hands-down was So <a href="http://sgddubu.com/locations.php?id=5">Gong Dong in Glenview, IL</a> and right down the road from the hotel.</li><li>Spacious rooms - newly renovated and plenty of room to stretch out. 
I like to do Yoga in the mornings and there is plenty of room for that.</li></ol>]]></content:encoded></item><item><title><![CDATA[Phoenix Coffee Roasters]]></title><description><![CDATA[Spent a month here working remotely from the midtown area (Central near Thomas), so I got to try quite a few downtown coffee roasters. I was surprised that Phoenix offered so much in the way of this hot brew! There were a few places that are top notch in the specialty roasters realm (I'm all about pour-overs). Featured Post Image is some building art of the building right next to the apartment we had rented. Press, Central Ave https://www.presscoffee.com Press has a minimalist, industrial space]]></description><link>https://matt.guide/phoenix-coffee/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151b9e</guid><category><![CDATA[Travel]]></category><category><![CDATA[Coffee]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Sat, 05 May 2018 20:11:35 GMT</pubDate><media:content url="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20180418_165321_v1.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20180418_165321_v1.jpg" alt="Phoenix Coffee Roasters"/><p>Spent a month here working remotely from the midtown area (Central near Thomas), so I got to try quite a few downtown coffee roasters. I was surprised that Phoenix offered so much in the way of this hot brew! There were a few places that are top notch in the specialty roasters realm (I'm all about pour-overs).<br> Featured Post Image is some building art of the building right next to the apartment we had rented.</br></p> <h1 id="presscentralave">Press, Central Ave</h1> <p><a href="https://www.presscoffee.com">https://www.presscoffee.com</a><br> Press has a minimalist, industrial space with outdoor area in the ground floor of a new, upscale apartment complex. That's not usually the type of place I find great pour over roasts, but I was pleasantly suprised! On their merch rack they have quite a few beans for sale, including a geisha! I actually bought that and am looking forward to brewing it at home.<br> <strong>Coffee</strong> Kenya, single origin. I found it fruit-forward. Smooth, clean finish.</br></br></p> <!--kg-card-end: markdown--><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/18-04-15-08-31-13-8708_v1--Small-.jpg" class="kg-image" alt="Phoenix Coffee Roasters" loading="lazy"><figcaption>Press Coffee Roasters</figcaption></img></figure><h1 id="cartel-coffee-lab-downtown">Cartel Coffee Lab, Downtown</h1><p><a href="https://www.cartelcoffeelab.com">https://www.cartelcoffeelab.com</a><br>Cartel is smack in downtown Phoenix. We saddled up to the outdoor counter-height seating near the sidewalk counter. Since I didn't make it inside, tough to tell much about their operation but the pour over menu looked enticing.<br><strong>Coffee</strong> San Agustin, El Salvador. Citrus, light on the tounge. 
Flavors of Effervescent (??), blackberry compote, and vanilla.<br/></br></br></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20180421_112038_v2--Small-.jpg" class="kg-image" alt="Phoenix Coffee Roasters" loading="lazy"><figcaption>Cartel Coffee Lab cups</figcaption></img></figure><h1 id="giant-1st-st">Giant, 1st St</h1><p><a href="http://giantcoffeeaz.com">http://giantcoffeeaz.com</a><br>Giant has another industrial, airy space where the entire front wall opens up to the sidewalk to create a very inviting space. They have plentiful and unique seating for those trying to get an hour of work in while sipping brew. Their roasts are very good for pour-over. Found this out later, but they also own Matt's Big Breakfast which offered a cut above the classic breakfast items--similar menu to any classic breakfast place but they do it very well.<br/></br></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20180406_103835_v1--Small-.jpg" class="kg-image" alt="Phoenix Coffee Roasters" loading="lazy"><figcaption>Giant Coffee cup</figcaption></img></figure><h1 id="lux-central-ave">Lux, Central Ave</h1><p><a href="http://luxcoffee.com/">http://luxcoffee.com/</a><br>Wanted to love this place! It is an ultra-hip spot to hang and work on your latest play, book idea, or app. I was overwhelmed approaching this place--even finding the door took a couple wrong turns including entering the architect/designer office that shares the same building. The Lux interior is dark and feels smoky with an eclectic mix of patrons- like the cantina scene in Star Wars!<br>As far as the coffee is concerned... They roast their own on site and love that. However, it is more what I would call <em>classic office coffee</em>. Even the light roast (only drip, no pour over) have the burnt aftertaste. Though, for those powering through a thesis, they do offer a lot of options that will get you through including red eye, macchiato, quattro ice espresso, and discount americano refills.<br/></br></br></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://res.cloudinary.com/mattguide/image/upload/q_auto:good/v1/blog-images/20180408_082155_v1--Small-.jpg" class="kg-image" alt="Phoenix Coffee Roasters" loading="lazy"><figcaption>Lux Coffee cup</figcaption></img></figure><h2 id="feature-image">Feature Image</h2><p>Taken by Matt Zajack near downtown Phoenix on the side of a print shop.</p>]]></content:encoded></item><item><title><![CDATA[All-in-One Gaming, NAS, Docker with UnRaid]]></title><description><![CDATA[How did I chose the on-prem setup? Although I ply my skills in the data science space, I think being a well-rounded, self-sufficient computer expert is a neccessity to stay ahead of the game. One way I stay ahead is to run my own server to find the latest techniques and get my hands dirty. I am also into fast-twitch FPS PC games, so I have some good gaming hardware and wanted to maintain the performance. 
We travel a lot, so I needed to find some way to make these happen with a physically small f]]></description><link>https://matt.guide/all-in-one/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151b9d</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Sun, 08 Apr 2018 20:24:05 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1490810194309-344b3661ba39?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=859603e411ace402066ec09bb2906f37" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="howdidichosetheonpremsetup">How did I choose the on-prem setup?</h1> <img src="https://images.unsplash.com/photo-1490810194309-344b3661ba39?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=859603e411ace402066ec09bb2906f37" alt="All-in-One Gaming, NAS, Docker with UnRaid"/><p>Although I ply my skills in the data science space, I think being a well-rounded, self-sufficient computer expert is a necessity to stay ahead of the game. One way I stay ahead is to run my own server to find the latest techniques and get my hands dirty. I am also into fast-twitch FPS PC games, so I have some good gaming hardware and wanted to maintain the performance. We travel a lot, so I needed to find some way to make these happen with a physically small footprint. I had a few options:</p> <h2 id="option1cloudcomputing">Option 1 - Cloud Computing</h2> <p>Perhaps one of the more obvious and future-forward options is to do everything on the cloud! This was very appealing since we push our clients this way. However, I chose against cloud for two reasons:</p> <ol> <li>High Cost. Although the <a href="https://calculator.s3.amazonaws.com/index.html">AWS Calculator</a> shows nominal individual costs for storage, compute, and other resources, it's prohibitively expensive to maintain 24/7 access to a wide range of services.</li> <li>Lack of Flexibility. It is much easier to play around with, forget about, and then re-learn tools and techniques when it's on your own hardware.</li> </ol> <h2 id="option2mychoiceonpremiseshardwarewithunraid">Option 2 (my choice) - On-premises hardware with UnRaid</h2> <p>I already had fast hardware and large drives, so I wanted to take advantage of that without the further monthly costs of cloud. There were quite a few options for on-prem hardware and software, but most of them revolved around simply hosting files. I entertained the idea of <a href="http://www.freenas.org">FreeNAS</a> but ultimately found it to be a bit limited, especially compared to my ultimate choice, <a href="https://lime-technology.com">Lime-Technology UnRaid</a>.</p> <h1 id="unraidfeatures">UnRaid features</h1> <p>UnRaid has diverse offerings, most of which the others did not offer:</p> <ul> <li><strong>NAS (Network Attached Storage)</strong> - This involves combining storage (typically terabyte-size hard disk drives) and hosting volumes for sharing files with devices inside the home network. High-end options allow for some failover with <a href="https://en.wikipedia.org/wiki/Parity_bit">parity</a>, mirroring, or other backup methods with or without a RAID setup--this is probably where UnRaid gets its name, because it is not a RAID setup--more on that below.</li> <li><strong>VM (Virtual Machine)</strong> - This ability allows hosting one or more operating systems on a single hardware structure.
This allows one to allocate different numbers of CPU cores, various amounts of RAM, and other resources such as USB devices and graphics hardware. UnRaid does something unique and allows assignment of complete resources, so, say, an entire GPU can be allocated at the hardware level, giving full performance.</li> <li><strong>Docker</strong> - This one was new for me when I started, and I have become a big fan. <a href="https://www.docker.com/what-docker">Docker</a> is similar to a VM in that it allows virtualization but goes so much further. It has operating-system-level virtualization called <em>containerization</em> that allows configuring, duplicating, sharing, etc. of entire application stacks. In this way, one installs a stripped-down Unix base image (usually something with CentOS or a variety of very small footprint installs), configures all necessary software, which usually involves Nginx or Apache for a web interface, and configures a base application like MySQL (or MariaDB) that is ready to use out-of-the-box. These are typically single-use images, so if you wanted to install a web application like Wordpress, you would install the Wordpress base Docker container and separately install the MySQL container for that Wordpress. This allows easier configuration and makes it more error-proof, since one container can go down without bringing down everything else.</li> </ul> <h1 id="myunraidsetup">My UnRaid Setup</h1> <p>You can see my UnRaid stack below:<br> <img src="/content/images/2018/04/UnRaid-gaming--2-.svg" alt="All-in-One Gaming, NAS, Docker with UnRaid" loading="lazy"><br> Briefly:</br></img></br></p> <ul> <li><strong>Bare Metal Rig</strong> - Contains a 4-core i5 (early model), 16 GB of DDR3 RAM, Nvidia 1070 EVGA FTW (use it for 4K gaming!), 500 GB EVO SSD, and three 2 TB Seagate HDDs.</li> <li><strong>UnRaid OS</strong> - Mentioned above, this includes the NAS, Docker, and VM, but it breaks down into a number of components: <ul> <li>Volumes - includes the NAS and file sharing component. This works with 4 TB available to share, and the remaining 2 TB is used as a parity drive for disaster recovery. I opted for this because even if one drive goes down I still retain all of my data, either on the 2 data drives or by recreating the bad drive from the parity.</li> <li>Docker Containers - This is my fleet of containers. I have a Ghost container for this blog, Nextcloud for my home backup and phone photo auto-uploading, a Wordpress container for giggles, and two MariaDB (MySQL) containers, one for Nextcloud and one for the Digikam photo management software installed on my workstations. Forgot to include one more--I have a DuckDNS.org container to update my IP address.</li> <li>Windows 10 VM - This is my gaming machine. I have allocated 3 physical cores to this machine, leaving 1 for UnRaid/Docker/NAS, given it 400GB of my SSD, passed through the entire GPU, and shared a volume for large game file storage. This has given me near bare metal performance. More on how I set it up below.</li> </ul> </li> </ul> <h2 id="hoesunraidwork">How does UnRaid work??</h2> <p>UnRaid itself is a Slackware-based OS with its own approach to disk pooling, parity, Docker, and VMs. For a comparable do-it-yourself stack--Debian, MergerFS for pooling disks, SnapRAID for the parity drive, and Docker for the containers--read the <a href="https://blog.linuxserver.io/2017/06/24/the-perfect-media-server-2017/">LinuxServer.io Perfect Media Server</a> article; it's so impressive.</p>
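<p>Since hardware pass-through comes up again in the VM tips below, it is worth knowing how to look up the IDs involved. This is a minimal sketch using stock Linux commands from a terminal session on the server; the devices and IDs it finds will differ on your hardware:</p> <pre><code class="language-bash"># List PCI devices with their [vendor:device] IDs;
# the GPU and USB controllers you want to pass through show up here
lspci -nn | grep -Ei 'vga|usb'

# Show how devices are grouped for pass-through (IOMMU groups);
# devices in the same group generally have to be passed through together
for g in /sys/kernel/iommu_groups/*/devices/*; do
  echo "IOMMU group $(basename "$(dirname "$(dirname "$g")")"): $(lspci -nns "$(basename "$g")")"
done
</code></pre> <p>The same vendor:device pairs are what end up in the <code>vfio-pci.ids=...</code> line covered in the USB-controller tip below.</p>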
<p>I opted to pay $60 for the <a href="https://lime-technology.com/pricing/">Lime-Tech UnRaid license</a> to use UnRaid and am very happy with the choice.<br> UnRaid installs as the base OS and works as a <a href="https://en.wikipedia.org/wiki/Headless_computer">headless server</a>. This way I just press start and forget about it. I SSH into it for special configuration but use the web interface for day-to-day tasks and monitoring, including installing containers, managing shares, etc.</br></p> <h1 id="usingwindows10vmasagamingmachinewithunraidbaremetalperformance">Using Windows 10 VM as a Gaming Machine with UnRaid - Bare metal performance</h1> <p>I had a hard time believing it, so I needed to try it myself. I get near-perfect performance running my Win10 VM with the above specs, just as I did using the hardware directly. I got my computer set up thanks to many online resources, which I will share:</p> <ul> <li>First - you <em>have to</em> check out this <a href="https://www.pcworld.com/article/3222652/gaming/how-we-hosted-a-star-trek-vr-party.html">multi-user gaming machine article</a>. It shows how, with UnRaid, you can install 4 GPUs on an i9 10-core system with 64 GB of RAM to let 4 Win10 VMs each have a VR headset to play video games. They assign 4 logical cores, 8GB RAM, and 1 GPU to each instance.</li> <li><a href="https://youtu.be/dpXhSrhmUXo?t=340">UnRaid video guide</a> showing how to do the entire setup. I skipped the physical install portion to get to the meat of the setup. <ul> <li>Tip: In your BIOS, set your onboard video as the default so UnRaid OS will use it, leaving your discrete video free for the VM environment.</li> </ul> </li> <li>An issue I had was there was no USB controller to pass through to the VM - UnRaid OS had claimed them all. This post saved me - <a href="https://lime-technology.com/forums/topic/52482-solved-usb-controller-pass-through-issue-no-available-reset-mechanism/">UnRaid pass through USB controller for keyboard and mouse</a>. <ul> <li>Tip: I had to add a line to the UnRaid <code>syslinux.cfg</code> file. It made my keyboard and mouse unavailable to the UnRaid OS, but since it's headless it doesn't matter. I was then able to assign the controller to my VM. The line is <code>vfio-pci.ids=8086:a12f</code>, where 8086:a12f is the vendor:device ID of the onboard USB controller, which can be found in UnRaid under Tools > System Info.</li> </ul> </li> </ul> <p>After all the setup, it works like magic. I hit the power button on my computer, it takes a minute to boot into UnRaid, then my Win10 VM auto-starts and boots. I have full control of my keyboard, mouse, and monitor just like any normal workstation! I leave the computer running 24/7. My Win10 VM has crashed a few times, but it was mostly due to running out of memory in the UnRaid OS because of large Docker containers (GitLab!!). I fixed it by allocating 10 GB to the VM and didn't notice a hit in performance. Though, I have since dropped a local GitLab CE in favor of BitBucket and am back to the 12GB VM.</p> <h1 id="finalwordsyouwillloveit">Final words - You will love it</h1> <p>The whole idea here is that you can have your NAS and gaming machine on the same hardware.
This allows you to double-down your money on one set of hardware to maximize its capabilities without sacrificing performance.</p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Blog Architecture - Static Ghost, Bitbucket, and Netlify]]></title><description><![CDATA[The Journey to Ghost Maybe not surprising, but I have my home server set up as a one-stop shop. I will do a write-up on this in detail. Short version: For the past couple months now I use my gaming rig to run UnRAID, which allows me to keep the a file server (NAS) running 24x7, run my main Windows environment as a VM, and run a Docker environment for hosting application all on the same bare metal hardware. This will focus on the setup I have for hosting this blog: a Ghost docker container runs ]]></description><link>https://matt.guide/blog-architecture-static-ghost-bitbucket-netlify/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151b9c</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Fri, 23 Mar 2018 19:05:51 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1456611984355-c05be968ebe9?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=03a713791b616db3c4d072d58cba7b00" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><h1 id="thejourneytoghost">The Journey to Ghost</h1> <img src="https://images.unsplash.com/photo-1456611984355-c05be968ebe9?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=03a713791b616db3c4d072d58cba7b00" alt="Blog Architecture - Static Ghost, Bitbucket, and Netlify"/><p>Maybe not surprising, but I have my home server set up as a one-stop shop. I will do a write-up on this in detail. Short version: For the past couple months now I use my gaming rig to run UnRAID, which allows me to keep the a file server (NAS) running 24x7, run my main Windows environment as a VM, and run a Docker environment for hosting application all on the same bare metal hardware.</p> <p>This will focus on the setup I have for hosting this blog: a Ghost docker container runs privately for writing and editing, I scrape static content using node.js, upload that content using Git to BitBucket, and hook in Netlify for hosting the static content to the web.</p> <p><img src="/content/images/2018/03/static-ghost-website-architecture.png" alt="Blog Architecture - Static Ghost, Bitbucket, and Netlify" loading="lazy"/></p> <p>Let me break down the reasoning for the components:</p> <ul> <li><strong>Ghost</strong>: Light-weight blogging platform. Wordpress dominates the web for content hosting, and I do love it! However, it feels like an 800 pound gorilla--an old one. Ghost is a fresh, light breeze that's laser-focused on textual content. For hosting it would have been easier to shell out the $20/month for a Ghost(pro) blog hosted by Ghost, but I am a bit cheap and wanted the challenge of hosting <em>static</em> content.</li> <li><strong>Node.js</strong>: Open-source, cross-platform server-side runtime environment. Ghost is built on a node.js framework. I had originally thought of spinning up another docker with either Python or node.js, but since the Ghost docker already has a node.js environment it made sense to keep it contained. Using a simple npm command, I can install the necessary components for making a static site- website-scraper.</li> <li><strong>BitBucket</strong>: Web-based version control repository. 
This was two-fold: I like the idea of version control for this entire website, and I wanted to force myself to learn and use Git commands. I chose BitBucket because it allows me to create a private repository for free, as opposed to GitHub, which only has free hosting for public repositories.</li> <li><strong>Netlify</strong>: All-in-one platform for automating modern web projects--they call it a drop-dead simple deployment workflow. I found Netlify from another article showing how to host static content. Netlify does some pretty amazing things. At its core, it's a CDN for hosting static content. However, it hosts this content by tapping into repositories in real-time, automatically. It has a slew of microservices that can make a site feel anything but static (taps into AWS Lambda functions??, forms, identity management). And, uh… it's free for my purposes.</li> </ul> <h1 id="thecode">The Code</h1> <p>The following closely follows the <a href="https://www.blacksandsolutions.co/blog/posts/static-ghost-blog-step-by-step-guide/">Black Sand post on Static Ghost</a>. I owe my entire approach to it!</p> <h2 id="setuptheghostdockerimage">Set up the Ghost Docker image</h2> <p>Access the <a href="https://hub.docker.com/_/ghost/">Official Ghost Docker</a>.<br> Install this however is best for your Docker setup. I have a GUI interface, but the YML looks like it covers everything. I use the internal SQLite setup and avoid the MySQL interface for ease, although I do like the idea of the MySQL interface for better backups. That may be less necessary once we have the scraping and repo up and running.</br></p> <p>My only real tip here is that you might want to capture the entire container path <code>/var/lib/ghost</code>. I mapped <code>/var/lib/ghost/content</code> and it's everything I need for a running site backup, but I am unable to edit the config or theme files unless I open a shell in the instance.</p> <p><strong>Start Ghost</strong></p> <h2 id="installnodejsandwebsitescraper">Install node.js and website-scraper</h2> <p>We will be installing the node components directly onto the Ghost Docker image. The reasoning is two-fold:</p> <ol> <li>Ghost runs on node, so the framework is available by default in the image. All we are going to have to do is install the website-scraping components to get it running.</li> <li>We can co-locate the scraping on the same machine to make it faster and more streamlined.<br> I also installed git on the instance so I could run the script and push the changes from the instance.</br></li> </ol> <p>Steps to install and configure the static web scraper:</p> <ol> <li>Log into Ghost: <code>docker exec -ti ghost bash</code></li> <li>Update your apt-get package lists: <code>apt-get update</code></li> <li>Install Git to upload from this image: <code>apt-get install git</code></li> <li>Install the website-scraper app: <code>npm install --save website-scraper</code></li> <li>Install an additional package for saving static content: <code>npm install --save fs-extra</code></li> <li>Create a directory for the static output in the <code>content</code> directory: <code>mkdir static-ghost</code></li> <li>Create the static.js file that will scrape the content. Note, I didn't create this file; it was generously available from the previous guide.
You will have to edit the siteUrl to match your system--this is required for the links to properly update:</li> </ol> <pre><code class="language-javascript">const outputDir = './static-ghost';
const siteUrl = 'http://192.168.0.105:2368/';

console.log(`Removing output directory: ${outputDir}`);
var fs = require("fs-extra");
fs.removeSync(outputDir);

console.log(`Analysing site at ${siteUrl}`);
var scrape = require('website-scraper');
var options = {
  urls: [siteUrl],
  directory: outputDir,
  //scrape posts, not just index
  recursive: true,
  sources: [
    {selector: 'img', attr: 'src'},
    //most images are displayed using background-image
    {selector: 'header.site-header', attr: 'style'},
    {selector: 'figure.post-full-image', attr: 'style'},
    {selector: 'div.post-card-image', attr: 'style'},
    //find stylesheets
    {selector: 'link[rel="stylesheet"]', attr: 'href'},
    //and any scripts used
    {selector: 'script', attr: 'src'},
    //shortcut icon
    {selector: 'link[rel="shortcut icon"]', attr: 'href'}
  ],
  //dont scrape external sites
  urlFilter: function(url){
    return url.indexOf(siteUrl) === 0;
  },
};
scrape(options).then( ()=> {
  console.log(`Static site generated under here: ${outputDir}`);
}).catch(console.log);
</code></pre> <ol start="8"> <li>Generate your static content! <code>node static.js</code></li> </ol> <p><strong>Note</strong>: I had an issue with the domain remaining "localhost" for links in the footer, RSS feed button, and a few other buttons. To fix it, I had help from this <a href="https://blog.ambar.cloud/tutorial-ghost-setup-with-docker-compose/">Ghost setup tutorial</a>. I needed to open <code>/var/lib/ghost/config.production.json</code> and edit the value <code>"url": "http://<GHOSTIPADDRESS>:2368/"</code>.</p> <h2 id="setupnetlifybitbucketandcommitthesite">Set up Netlify, Bitbucket, and Commit the site!</h2> <p>The final steps are to move this static content somewhere out on the world wide web! This will be accomplished by uploading to Bitbucket then linking to Netlify.<br> Create a <a href="https://bitbucket.org">Bitbucket</a> account.<br> Create a <a href="https://www.netlify.com">Netlify</a> account and link it to Bitbucket.</br></br></p> <h3 id="commityourstaticfiles">Commit your static files</h3> <p>From your Ghost Docker console, upload to BitBucket.</p> <pre><code class="language-bash">cd ./content/static-ghost
git init
git remote add origin https://<USERNAME>@bitbucket.org/<REPONAME>/static-ghost.git
git add -A
git config --global user.email "<USERNAME>@bitbucket.org"
git config --global user.name "<USERNAME>"
git commit -m "add static files"
git push origin master
</code></pre> <p><strong>Note</strong>: Mentioned in my <a href="/first-post/">First Post</a>: I am a git newbie, and I had gotten stuck and deleted my entire content directory after I had it loaded to stage with a <code>git reset --hard</code>. DO NOT DO THAT!! Apparently the better way is with a <code>git rm</code> command, but I am not all that familiar with it still.</p> <h1 id="checkoutyoursite">Check out your site!</h1> <p>Netlify will automatically build based on your content. It may take a few minutes to do it the first time. Then you can attach your own domain and use all the other great features.</p> <!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[First Post! again...]]></title><description><![CDATA[I accidentally deleted my real first post because I had no clue what I was doing with git! I did a git reset --hard after initializing and git add-ing my Ghost content directory.
After reading into it more, apparently that command will delete everything in the stage and working dierctory. I have a lot to learn about git. So, I am going to try to re-capture my intents for my site. Mission Statement I want to create this site to give back to the web that has helped me so much with work and life. ]]></description><link>https://matt.guide/first-post/</link><guid isPermaLink="false">Ghost__Post__605d096ed764a10001151b9b</guid><category><![CDATA[Data Science]]></category><dc:creator><![CDATA[Matt Zajack]]></dc:creator><pubDate>Thu, 08 Mar 2018 08:33:14 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1480057261736-36852db40e50?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=3428124210f81fb4a3d7bc7249d8fc72" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://images.unsplash.com/photo-1480057261736-36852db40e50?ixlib=rb-0.3.5&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ&s=3428124210f81fb4a3d7bc7249d8fc72" alt="First Post! again..."/><p>I accidentally deleted my real first post because I had no clue what I was doing with git! I did a <code>git reset --hard</code> after initializing and <code>git add</code>-ing my Ghost content directory. After reading into it more, apparently that command will delete everything in the stage <em>and</em> working dierctory. I have a lot to learn about git.<br> So, I am going to try to re-capture my intents for my site.</br></p> <h1 id="missionstatement">Mission Statement</h1> <p>I want to create this site to give back to the web that has helped me so much with work and life. I rely on resources far and wide to do just about everything more efficiently, and I believe when I find or make a better way I should return the favor.</p> <p>I also want to do some shameless self-promoting. I am a Data Scientist and want to put my name out there and help innovate across industries.</p> <p>For me, the site will help to:</p> <ul> <li><strong>Bubble-up ideas to present to a wide audience.</strong> I create a lot of content, but it just about always remains hidden in a shared folder somewhere. I want accountability to take the content, generalize it, and then better communicate the business-related outcomes.</li> <li><strong>Formalize thought processes.</strong> I have been doing a lot of deep thinking on how I can contribute at work. My notes are a scattershot of half-baked thoughts and ideas. I want someplace where I have to put them into full sentences.</li> <li><strong>Have some fun!</strong> I like to travel and the consulting lifestyle calls for a lot of it. I want to be able to share some of the food, culture, and character of the different areas I visit.</li> </ul> <p>Cheers!<br> <img src="/content/images/2018/03/IMG_2661_v2_resize.jpg" alt="First Post! again..." loading="lazy"/></br></p> <!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>