GeoCities And The Sands of Time
Sun, 10 Oct 2021

A piece about the demise of GeoCities, the desire to create web content and how the landscape truly has evolved over the last 20+ years.

Hey all.

It's been a minute, hasn't it? Or in this case, almost three years, to be exact. The Double Density website has remained dormant and Angelo and I have often joked on the show about how one day we'll get around to updating parts of it—a promise that we've turned into a sort of perpetual joke... Until now.

Ironically enough, this "something new" (in the form of a blog post) is being created out of something old—20+ years old, in fact.

Here we are now, entertain us

For the last few months, I've regularly checked in with a tumblr account called One Terabyte of Kilobyte Age Photo Op (OToKAPO). The account posts a few screengrabs a day of defunct GeoCities websites (given that GeoCities hasn't been a living, breathing web entity in roughly a decade for most of the world, they're technically all defunct), sorted by their last updated timestamp. The Internet Archive did a fantastic job of springing into action and saving every piece of web real estate when Yahoo! announced the domain's closure, and Archive Team (another group of preservationists) offered up a 600+ GB torrent of all of the available data... And now, in 2021, there is a tumblr account that posts these little bits of forgotten web lore. A lot of these pages are just 404s, but some truly offer a fascinating insight into how regular web users were interacting with the world wide web during the late '90s and into the 2000s.

I've mentioned this on the show a few times, but I got my start using GeoCities' WYSIWYG editor to create a rudimentary website (1) mere weeks before we entered the new millennium. A switch was flipped sometime between first gaining access to the internet in 1997 and becoming the family's web pioneer at the ripe age of 12, one that made me want to actually... interact with the content I was seeing. Countless hours were spent trying to forge some kinda web identity by setting up a homestead (pun intended), all the while hunting for MP3s stored on publicly-accessible web servers (2) (I wouldn't get access to private FTPs for another year or so) and spamming the message boards on GameSages dot com to boost my post count, with the endgame of using it like some kinda badge of honour.

Visiting OToKAPO always conjures up these strange feelings about my own habits on the web, habits that go back decades. Apart from the aforementioned activities, I also spent many a day on the IRC servers DALnet and EFnet, and an immeasurable amount of time on various social media platforms, starting in earnest with MySpace and onwards. (3) I remember the day I learned the word blog early in 2000 (while hanging out in the #beefcake channel on DALnet, hub to only the best South Park fans) and thinking that it was a weird way of describing a wooden log of some sort. It took me a bit of time to truly understand its meaning.

These weird reminiscence sessions also had me thinking about the shifting nature of internet content itself, and how the content we see and interact with has truly evolved (or, quite truthfully in many examples, devolved) over the last few decades.

Share a little love in your heart

Carving out a digital identity back in the day, being able to share of yourself, used to be much more of a dedicated activity. Whether you took part in a listserv to argue the finer points of the newest X-Files episode or best practices for planting flowers in the Pacific Northwest, or you wanted to create a webpage to share a hobby you cared about immensely, you first had to learn the ropes of how to access these avenues, which were often arcane in nature and only lowered their barriers to entry over time to allow people of all stripes to join in.

Enter 12-year-old Brian, intent on fucking shit up in his little area of the web. Prior to GeoCities' free WYSIWYG editor, I hadn't truly wrapped my head around what coding a website could be like, or how I could get there. I do remember my mom taking out a library book when I was 4 or 5 filled with simple computer programs coded in BASIC that she inputted and allowed us to use on our old 486 CGA, but beyond that my access was very limited. And suddenly, the doors blew open and I was creating background boxes and throwing around text of various fonts (my personal aesthetic as a pre-teen was absolutely chaotic neutral-fuelled) with reckless abandon.

Yes, I did learn how to (badly) code eventually and even took a C++ class to understand more about how programming languages worked, but my resources at that juncture were very limited. And so I made what I could with what I could... And let's not even get into the storage side of things—I remember GeoCities' infernal 15 meg limit, which severely cramped my style. I also remember a time in the early 2000s when 50megs dot com was a thing and tried, in vain, to get a free account on there so that I could do more. Not having access to a credit card to get server space truly was a roadblock I had to somehow circumvent.

Let's circle back to intent for a sec. Or rather, the notion of wanting to create. The burning desire to share of yourself in a way that perhaps wasn't possible through methods of direct communication (be it via email or instant messenger, or an in-between piece of software like ICQ, which I truly feel was one of the first bridges between the two modes of connection). You wanted to stake your claim, to have a piece of virtual real estate you could direct others to. I've seen all kinds of interesting uses of a GeoCities web address on OToKAPO—family reunion photos, professionals' CVs, fandoms of every shape and kind, lists of music bootlegs users were looking to sell/trade. There was a bit of everything, all with this weird homemade flavour to it.

The common link was that these people took time (in some cases, considerable amounts when you think of the coding they'd done) to share of themselves. Believing that their contribution to the world was a positive one. Wanting to weave together knowledge and love of a subject, building towards a more expansive (and in some cases, inclusive) web experience, creating an additive atmosphere for users and surfers alike. (4)

And now, we've taken that maxim of wanting to share to its highest degree.

That's dirty pool, mister

Fast forward to the modern content landscape, and it's easier than ever to create content of all sorts and share it to the web. You can grab your smartphone, point it at yourself and shoot, streaming live to Facebook, YouTube, Twitch or wherever you want. Your digital identity, the way in which you decide to showcase yourself to the world, is only a few taps away. A lot of marketing around the turn of this century was dedicated to explaining how people could now become brands, and that idea has never been more present. That's all we are—personal brands, whispering sweet nothings into microniches and attracting sponsors to continue the feedback loop creators enjoy.

Somewhere along the line we've taken the credo of being "citizens of the web" (every reader of web news from the mid '00s should be wincing now when reading this term, as it was everywhere to describe the way Web 2.0 was "transforming the world" (5)) and cranked the volume knob to 10. We've taken to sharing way too much of ourselves in order to seek attention, validation and detractors. The additive attitude that creators used to largely inhabit has been kicked to its virtual curb (let's keep the web real estate metaphor going) and we've all moved into noisy high rises (hey, I never said I was any good at metaphors either, I'm just using 'em cuz I can), filled with way too much in way too little of a space. We're all a click away. We're prompted to comment, to share, to say more, as we hold the potential to partake in a million scuffles while endlessly scrolling. We no longer have to work that hard to seek more, or to find diamonds in the rough. They are all offered to us as marketers pay sites for our information to then sell us the things they think might interest us the most.

Perhaps we share so much of ourselves that the quality of our sharing is diluted. Or perhaps we share so much of ourselves that our concentration on what makes us truly unique and special, what we really should be sharing with the world, disappears. It's a levelled-out playing field where everything hangs out for everyone to see and Lord knows I am certainly guilty of that (plug plug plug, my Twitter and Instagram accounts are right here). I shitpost entirely too much. I don't have enough of a substantive web (and by that I mean purely web, not social media) footprint that I am proud of... And I'm not quite sure how (or if) I want to change that. Being on the internet for this long has certainly tired me out, and my shitposting has definitely taken a nosedive.

Obviously there are exceptions to all of this in that the internet still exists and individuals (or groups) run websites dedicated to every facet of interest imaginable, but somehow it all feels lesser, as people often treat these sites like the job they are (including the obligatory Amazon affiliate link) instead of the burning fire that rests within them.

What's that, over the horizon?

Over the past year or so I've started subscribing (and in some instances, paying to unlock more content) to several newsletters on various subjects that interest me. Somehow, these pieces of mail that land in my inbox feel like a connection to a bygone era of the internet where I could find out more about a topic, a subject or a person in a purer way. This concept of curated content, tied to an individual, makes me feel as though I am once again clicking through the wastelands of a web that wasn't absolutely dominated by terms optimized for search engine crawlers. It makes for a more direct approach to ingesting information, in places where I don't also have to contend with the commentsphere descending (or actually, let's be honest, ascending) upon me to share added "context" I didn't ask for. It's B2C, but it's for shit I care about.

These slices of web/life certainly evoke a sense of nostalgia within me for a time I know I can no longer take part in, at least not in the way I'd want to. I'm donning rose-coloured glasses after 15 years of relentless content being pushed to me (and in some instances, content I myself have pushed onto others), of discussion, of discourse. Signs. Signifiers. Words, images, and their intended meaning. Reconstituted notions of what everything means. Words upon words. Videos upon videos. Flash video (RIP) upon Flash video (RIP2). To quote Bo Burnham, a little of everything, all of the time.

I admit that my experience on the web, the moment in time where I hopped on the WWW trolley, is specific to me. Others who have either held a lifetime pass for decades, or who have only really hopped on the ride in the last 10-15 years, doubtless have their own views on web content. I own my biases and understand they exist. I'm by no means a perfect user/creator, I'm just someone who exists, who has spent an unsightly amount of time ruminating on what was and what shall be. Cursed with brainworms when it comes to thinking about web content and the ways in which we all relate to each other, and the methods we use to share what's important to us.

At the end of the day I'm just out here on the curb, kicking rocks. See you around the bend, as I await the next onslaught of information to deluge me like a storm of juicy, stinky trash water.

(1) The webpage is a true eyesore, and for that I make no apologies. The idea behind the website was borne out of a hobby a friend and I had become obsessed with—our own fictional radio show, which we would create by gathering around my old Radio Shack boombox's tiny mic hole and speaking Mad TV- and South Park-poisoned nonsense into it. I do apologize, however, for the awful language and truly puerile jokes we made as pre-pubescent boys.
(2) The first MP3s I remember finding (and being excited about) were Kid Rock's 'Cowboy' and Blink-182's 'All The Small Things'. Wonders never cease to amaze.
(3) I also used to maintain a Xanga blog for years and years and a recent dive into those torturous words of my youth brought moments of self-cringe so incredible that I could only take so much of it before clicking away (thanks, Internet Archive).
(4) I know this is a broad generalization and for the sake of the piece I'm willing to make this claim. I know that extremist groups and other morons have also used the web to communicate disgusting ideologies throughout the last few decades and this piece isn't meant to whitewash that—what I'm getting at is that the web was largely an additive atmosphere, and this sort of behaviour was truly in the minority.
(5) I often treated these terms in the same way as the Gabbo episode of The Simpsons, when they'd show the Gabbo preview commercials... GABBO. GABBO. GABBO.

When Horror And Tech Mix
Tue, 30 Oct 2018

An exploration of how technology both hurts and helps the horror film genre.

To weave a tale...

A quick note before we begin: I’ve linked to several videos in this essay that are very graphic in nature, so please keep that in mind while clicking. Happy reading!

There is a claim out there that technology, at its core, is neutral. It is, these folks allege, a tool at its base, a resource for the human race – whatever meaning and uses we attach to technology are borne of our own biases, deficiencies and desires. It does not advocate for one stance over another when it comes to a multitude of moral questions.

For the last century, technological advances have allowed filmmakers to tell stories in increasingly complex and visually arresting ways - but is it really worth it? Are these advances simply smoke and mirrors meant to dazzle an audience, or do they truly enhance a filmmaker's ability to actualize their story without sacrificing authenticity?

Nowhere is this more apparent than in the horror genre. The culture that has been created around this type of film is amongst the most diehard in all of motion picture fandom. For decades, people have lived to be titillated, frightened and excited. There exist many figureheads who have found success in different subgenres – auteurs from around the world have managed to sustain long-running careers out of the dark, the lurid, the scary. And they've often employed cinematic tricks to be more effective in their storytelling.

Let’s go down the horror rabbit hole and explore both how technology detracts from the horror genre, and how it also aids the genre.

The monster lives

There is something palpable about effects created in the same space where the action takes place. True, some effects are shot apart from principal photography, but there is something about these bits of a movie existing in a real, physical, shared world that makes immersion a lot easier, in some ways.

No matter how much we perfect post-production effects, there is still something inauthentic about viewing the latest instalment of a Halloween blockbuster, like when a computer-created ghost suddenly whisks itself into a series of frames, evoking a classic jump scare in the viewer.

I understand that there are financial limitations for some productions that decide to enter the realm of the horror film; for a long time this has kept some at bay and made others turn towards digital effects houses. Now, this may be the snobbiest thing I'll say during this entire essay: practical effects rule, and they are second-to-none when it comes to the horror genre. They create a purer movie viewing experience for a number of reasons.

Take the practical effects of John Carpenter's remake of The Thing - the way the parasitic spider-being/head-crab exists in that remote outpost, sharing space with the actors in scenes and inserts shot on location. There's a real tension to the way in which the action is portrayed.

Picture one of A Nightmare On Elm Street's iconic kills - Johnny Depp's Glen getting sucked into a bed, and then the bed becoming a virtual blood geyser as it spits him back out. An ungodly amount of fake blood, gushing out. The lurid reds of Glen's blood, imprinted on the film stock it was shot on, lend a certain cohesive feeling to the entire experience. These practical effects, of course, are artificial, but they exist, and are concrete. You can touch the materials used to create these effects. They were put together by hand, in a shared physical space, most likely an effects warehouse or workshop.

Let’s take a more extreme example: Peter Jackson’s 1992 horror/comedy hybrid Dead Alive (also known as Braindead) features a scene where Timothy Balme’s Lionel enters a home with a lawnmower, cutting down hordes of undead beings as blood and chopped limbs go flying around. These practical effects are all done in-camera, congruent with the narrative being told. Everything is left in the frame - we aren’t altering its composition or trying to include something that didn’t exist outside the context of what the camera captures.

These effects, which admittedly can sometimes be imperfect in creation and execution, bring their own flavour. Think of every kill Michael Myers undertakes in the Halloween franchise’s first few films (barring, of course, Halloween III and its gruesome maggot-filled ending) - they all happen right in front of us, as someone stood there filming these sequences. They soak up the same energy and benefit from existing in a physical reality.

Legends in the field of special effects like Tom Savini and his protege Greg Nicotero have been able to translate words on a piece of paper into spectacles that still delight, fright and haunt us to this day.

Visual effects created in a digital landscape add a layer of artificiality to film. Beyond fictional narratives being acted out by people hired to take on roles, we introduce further artificiality to the proceedings, further removing the conceit of cinema as real. And in the horror genre, being able to pass off the narrative as “real” is essential in most cases - a film has to create a version of reality where the events that take place could be construed as real, and it has to lull the viewer into buying in for the jump scares and frightening moments to be effective.

Sidenote: There also exists a cottage industry of “so bad it’s good” films that, in their own strange little way, celebrate the conceit of cinema and exist (sometimes on purpose, but mostly by naivete) to excite audiences, who have mythologized certain terrible films (such as Troll II or Birdemic) in a knowing, “wink wink” way that real gorehounds trade when they get together, either online or in a shared physical space, such as one of these special screenings. The fact that “revered” (to use the term broadly) 1960s stinker Manos: The Hands of Fate got a great Blu-ray restoration put out by Synapse Films should be a testament to horror fans at least having a sense of humour about themselves.

Yes, yes, of course – most movies are now shot on digital mediums and therefore some of what I’ve said before is now negated for reasons of practicality, but I still contend that in-camera practical effects enrich the filmic experience and offer greater immersion to the viewer.

The modern viewer: a minefield of adverse issues to overcome

Technology has also robbed us of the ability to be genuinely surprised. We are smack in the middle of spoiler culture, and the notion of filmic myth is shattered due to our constant connectivity.

In 1980, Ruggero Deodato's gory Italian faux-documentary Cannibal Holocaust was a circus unto itself - Deodato made his cast sign agreements not to appear in any other productions for a year after the film's release. While this plan backfired and Deodato was forced to produce his actors in court when charges were levelled against him, his intent was nothing short of old-school theatrical. The myth of these dead characters seemed plausible because it took more effort to locate and identify the film's players given the modes of communication of the time.

A subgenre also entered the limelight in North America during this time period that blurred the lines between real and fake - the Faces of Death-style films created in the spirit of the mondo film.

Mondo, or “world” in Italian, is a type of exploitation film that sensationalized the ways in which different subsets of people around the world lived, to create shock, horror and awe in the Western viewer. First popularized in Italy in the early 1960s (the most famous example, perhaps, is Mondo Cane), these faux documentary films purported to show how life really happened out in “the wild”. It is a deeply problematic (read: racist) subgenre of film that was effective in certain ways because viewers had no way to independently verify the claims made in the films unless they themselves decided to travel to (for example) South America or Africa to ascertain their validity.

Faces of Death largely put aside the globetrotting aspects of the original Italian mondo film. Instead, it blended reels of graphic news footage that the filmmakers procured through a number of means with obviously cheap-looking fake scenes involving cults (a buzzword that could easily set the public off when used in the ‘80s) in an effort to pad running time, all held together by a framing narrative delivered with laughably over-the-top lurid narration by an actor purported to be a medical professional, selling the value of these clips. Through a law of diminishing returns, the series was watered down as budgets dwindled and interest dropped off as the 1980s ended, putting the notion of fake-as-real films to rest in popular culture.

Over the following decades, however, these modes were transformed in such a way that it made it much easier for people to connect, with both good and bad repercussions.

Jump forward to the middle of 1999 - a summer soundtracked by Carlos Santana and Rob Thomas' “Smooth” - we have a pair of movies that contained twists and spoilers meant to throw audiences.

The first, The Sixth Sense, about a boy who sees dead people, is a tale about mortality, and its twist during the film’s third act revolves around the fact that the man who guides him on his journey is a ghost himself. This was, of course, a narrative sleight that was so effective because the story was carefully crafted to shield the viewer from this fact until it could be used to great effect, in the same vein as Keyser Söze’s reveal.

The ways in which this film could be spoiled were largely through word-of-mouth interactions propagated by asshole friends, coworkers and acquaintances who proto-edgelorded in a way to anger those around them. The commercial web, still in its 1.0 infancy, did offer up similar spoilers and breakdowns of plot threads, but since web culture wasn’t as prevalent, it was largely these physical interactions that served as the vector for spoilers.

The second, The Blair Witch Project, arguably began the modern found footage horror movie phenomenon. Although films like Henry: Portrait Of A Serial Killer flirted with the inclusion of home video-grade footage as part of their narratives, the basis of their storytelling approach was not rooted in “reconstructing” reality and passing itself off as real - Henry was a film, through and through, with its non-diegetic music and linear camera movement.

The Blair Witch Project was perhaps the first modern horror movie to try and present its contents as "real" through consumer-grade equipment. While Henry and the other films strove to try and present themselves largely as pieces of fiction despite their budgetary shortcomings, this was really a film trying to disguise itself as a sort of "truth on video and film".

A largely improvised film with a remarkable ending, The Blair Witch Project also saw its creators use the internet, just as it was entering a period of unparalleled accessibility in North America, to build a website filled with content related to the film, calling into question what to believe when it came to the film’s lore.

Modern audiences don't have the luxury of organic discovery when it comes to pop culture.

Now, within hours of any Hollywood premiere, we can scan a film's Wikipedia entry for all of the relevant plot points. The magic of mystery is gone. When The Force Awakens came out in December 2015, choppy stills of Han Solo's death were splattered across large corners of the internet within hours of the film's wide release.

Trolls would insert this info into non-Star Wars-related discussions across message boards and social networks. A campaign was seemingly waged to spoil the entire thing. The only way to properly avoid anything would be a cold turkey approach to the internet: cut it off, or else.

Technology has inserted itself firmly in-between viewer and viewing experience, a buffer that distracts and creates chaos in the relationship in-between film and film buff.

Sometimes, though, technology has the ability to empower people who may otherwise not have the financial means to create the sort of film they wished to create, with a look and feel that can measure up to any studio offering.

This is how we do it

Technology, of course, can also be used effectively to convey a story for a fraction of the price it would have cost even 20 years ago.

Cheaper video equipment appearing on retail store shelves opened up an avenue for filmmakers (who, in some cases, we can refer to as auteurs because their brand of film would certainly be referenced as… unique, to put it mildly) to shoot their pieces on affordable, readily-available consumer-grade video, starting in earnest in the 1980s. Films like Germany’s bluntly-named Violent Shit, Woodchipper Massacre (I’m unsure if you can really call it a massacre if only two people are murdered, but I digress), Video Violence (and its sequel), Canada’s Things, and more all felt like winking nods to the horror genre, but these never strove to be considered authentic - and by authentic I mean that their narrative is anything but an artificial construct.

I will note an affinity for a lot of these entries, especially something as outrageously entertaining (and sometimes unintentionally hilarious) as the Todd Sheets-helmed Zombie Bloodbath trilogy. The first film, in particular, is fun to watch – pay attention to the mulleted fellow who appears throughout the film (and turns undead during its runtime) and how he makes his way into several scenes, as geographically impossible as it may seem.

Although critics first labelled 2005’s Hostel with the term torture porn, the truth is that much more convincing films had been in existence for almost 20 years at that point, utilizing consumer-grade video equipment to drive home the point of realism. Case in point: the Guinea Pig film series, from Japan.

The first entry, The Devil’s Experiment, captures unknown assailants kidnapping a woman and conducting experiments on her to test the human threshold of pain – the entire film lasts about 45 minutes, but watching it feels much longer, probably due to the harrowing nature of the activities presented. Unflinchingly presented, this is probably one of the finest examples of the effective use of consumer-grade video equipment.

The second entry (Flowers of Flesh and Blood), involving a man in a samurai getup kidnapping and dismembering a woman, is also shockingly effective. This is technology harnessed for maximum shock value, and it largely works… So much so that the lore surrounding the film has grown, mostly due to the much-told tale of Charlie Sheen viewing Flowers of Flesh and Blood and, believing it to be real, contacting the FBI. The bureau opened an investigation but quickly dropped it when the filmmakers provided extensive making-of footage documenting every nasty trick undertaken.

I’d be remiss not to mention that the last two entries in the series (Android of Notre-Dame and Mermaid in a Manhole) represent a definitive shift in tone for Guinea Pig. Gone is the faux-real feel of the first few films – in its place is a strange and, dare I say it, almost kitschy quality to the tales, which seem like a ray of absolute sunshine when put up against the start of the series, and the first two entries in particular.

Advances in technology have democratised the ability to crank out a piece of budget-minded video nastiness. Gone are the days when you had to oversee a budget largely governed by how many feet of film you could shoot and develop.

Filmmakers of the current generation can cheaply buy high-definition grade cameras and lenses, and storage mediums are cheap and plentiful. You can easily dump your footage onto a portable hard drive after a day of shooting and begin to edit on a digital editing suite pretty much right away. It’s no coincidence that the found footage subgenre of the horror film has exploded in popularity over the last two decades thanks to these advances.

The most obvious example of the modern found footage trend is the original Paranormal Activity film. In a post-The Blair Witch Project world, this was the first franchise to spring up and weaponize the trope to great financial returns. Released in 2007, the film was shot on an estimated budget of about 15,000 dollars and went on to gross over 193 million dollars worldwide, making it, ratio-wise, one of the most successful films ever released by a major Hollywood studio. But the studio system isn’t the only one to take advantage of the perks of digital-only filmmaking.

Take, for example, the indie-produced trilogy of V/H/S anthologies - gruesome stories told largely through a variety of media by genre directors of some renown like Ti West, Gareth Evans and Adam Wingard. (Aside: the second entry in particular is worth at least one viewing, and both Angelo and I give it our patented Double Density seal of quality.) There’s a zombie tale as told by a mountain biker wearing a GoPro. A documentary crew witnessing what may be the end of the world. A tale of a hostile alien threat as told through video chat. These stories take advantage of the limitations of the technology in their employ in order to more effectively tell their tales.

Now, I have to take a second here and mention that although the found footage subgenre is one that has taken prominence, more traditional forms of filmmaking have also utilized newer forms of technology to further their storytelling.

One of my favourite films of the past few years is the 2012 zombie film The Battery. Shot on a purported budget of 6,000 dollars, the crew spent most of their money on catering and equipment rental, allowing them to borrow lenses that gave their film a gorgeous feel.

I have to underline that I’m making a differentiation here - when I say that technology has aided filmmakers, I mean the tools at a filmmaker’s disposal have made life easier (ostensibly, and tongue-in-cheek, the means of production). Camera equipment, digital editing suites, and media storage options are readily available from retailers. Cheaper costs and a substantial rise in equipment quality have lowered the barrier to entry. I’m still stanning for in-camera action, with as little post-production meddling as possible – I’m advocating for the usage of physical space to execute what’s required for a film’s story.

It'd be naive of me to say that this lowering of the barrier has only had beneficial impacts on the film industry. There is also a lot of run-of-the-mill fare out there now vying for attention. A quick glance at several platforms on smart TV apps, for example, reveals a very crowded field to choose from. I've watched a few myself and a lot of these movies commit the cardinal sin of being competently-told stories that run through the regular tropes - in short, boring, formulaic, by-the-numbers affairs. Neither horrendously awful (which would at least enter into amusing territory), nor good enough to land on any critic's year-end list.

Another upside to technological advances in general is that media literacy has also played a large part in how viewers now approach film viewing, with technology serving as a tool for fact verification. Take, for example, the mondo films I'd mentioned earlier. They wouldn't as easily pass muster now given the exposure viewers have had to how the world operates, and beyond that, viewers can also easily demystify the claims made in a film through a simple web search. This makes for more well-informed viewers, and forces filmmakers to become more and more creative with how they want to present a story, understanding that the tricks of yesteryear don't hold much weight against viewers' expectations.

Pulling the curtain back

From the days of silent fare like F. W. Murnau’s Nosferatu to the monster pictures of the 1950s, to the gore-soaked 1970s and ‘80s, up until now, the horror genre runs the gamut of interesting tales bolstered by ever-emerging ways in which to bring them to life.

Technology has made it easier for those who want to enter the arena to do so, although this has also created a level of market saturation that makes it harder to find quality content. These advances have also made the creation of special effects more affordable, much to the detriment of some of us who agree that tactile special effects are superior. Technology has also made it harder to avoid learning almost everything about a film before its release - a loss of the mystique that the genre benefits from.

How we view these films, and the knowledge we possess about how to consume media, is becoming more and more sophisticated, and filmmakers are continually trying to keep this in mind as they create new entries in the horror canon.

New iPhone Tue, 17 Oct 2017 07:00:00 -0400 796e2936-22f2-4f16-bf7c-459760e4bf68 After much internal debate, I recently picked up an iPhone 8 Plus. My carrier offered me an excellent deal, so I decided to join the Plus club. Initially, the larger size was a bit of a shock, but I quickly got used to it and I really don't feel it's too big anymore (my wife says it looks ridiculous in my pocket, though). A lot has been said about the iPhone 8 Plus and I would urge you to go out and read one of the many excellent reviews out there. I really want to concentrate on my own upgrade experience, and how it's improved since I last upgraded in 2015.

I specifically want to talk about the process of restoring my data on my new iPhone and how that worked out this time. I have never had any luck with this task; from moving my data from my iPod Touch to my iPhone 4, up to the transfer from my iPhone 5s to my iPhone 6S. That last transfer was especially tricky because there was the added element of an Apple Watch. Each time something went wrong, but it did get better from phone to phone.

The iPhone 4 transfer caused crashes and made the phone unusable, so I had to start fresh. In that case it made sense because I was going from an iPod to an iPhone.

With the 5s to the 6S, it was more cosmetic, in that some apps were zoomed in, but it annoyed me enough that I decided to start fresh as well. Each time, starting fresh made me lose some data, but it was never anything essential.

Apple has vastly improved importing data into a new iPhone (well, most data - some settings and passwords have to be re-entered). The first thing is that there is now a Quick Start feature that transfers over a lot of settings before you even have to access iCloud. It uses the cool particle cloud animation that is also used in the Apple Watch pairing process to get the two iOS devices talking. Once this is all done, the new iPhone asks if you want to bring over the latest iCloud backup. This is where I was tempted to just go with a new install, but I decided to give the iCloud restore a try, and after about a week, I have noticed none of the weird issues I had in previous attempts. I did the exact same process moving my wife from her iPhone 5s to my old iPhone 6S (with a fresh battery, which has made a HUGE difference) and she has been happily using that phone too. I can now safely tell people that restoring from an iCloud backup is a great option and is definitely the way to go.

I am really enjoying my new iPhone and it’s a pretty big update coming from an iPhone 6S, especially since I moved up to the Plus size with the dual cameras. Taking pictures has never been this great on an iPhone. The cameras have always been excellent, but the pictures I’ve been taking on this phone are incredible. If you’re interested, you can go check out my Instagram account for a few pictures I’ve taken with it. The portrait mode is especially fun and when it works, the pictures it takes are wonderful.

I can safely say that not waiting for the iPhone X was the right move for me. The fact that the demand will be so high and the supply so low makes it clear to me that my carrier would not have been as eager to give me such a great deal. I may change my mind once people start raving about the iPhone X, but for now I'm pretty excited about how great this iPhone 8 Plus is.

iPhone Choices Fri, 29 Sep 2017 09:00:00 -0400 ad76ac3f-1c17-442d-99ca-4a8120baf219 The best iPhone ever... for a few days

On Friday September 22, the iPhone 8 started shipping, and it’s the best iPhone ever, like every iPhone released every previous fall ever was. The difference this fall is that it’ll only be the best iPhone ever for a limited amount of time. The iPhone X was announced at the same time as the 8 during Apple’s Fall 2017 keynote presentation, and it is the future of the iPhone. For the first time, Apple is releasing a flagship iPhone in a short amount of time after releasing what would normally be the flagship iPhone.

Who is the iPhone 8/8 Plus for?

I've been wondering who will be getting the iPhone 8 now instead of waiting for an iPhone X. When I passed by the downtown Montreal Apple Store on the morning of the 8 and 8 Plus release date, there was no huge lineup. The diehard iPhone fans that buy every iPhone on day one will not be getting the 8 and 8 Plus - they're all going to wait for the X. This seems pretty obvious: why get a regular iPhone when you can have a technically-superior iPhone just a few weeks later?

Most people think I’m one of those people that needs the latest thing, but I’m really not. The only reason I upgraded my iPhone 5S to an iPhone 6S was because we needed a second iPhone for the family, as my wife’s Android phone was completely dead. The plan for the next upgrade is to get a new iPhone sometime in the late winter or early spring, and then have the battery replaced on my iPhone 6S since there’s a repair program available for my model. This way, next spring, my wife will have an iPhone 6S with a fresh battery, and I’ll have either an 8 Plus or X.

This is where I'm faced with a decision. Do I go with the latest and greatest, or with the evolution of a proven formula? The tech geek in me is saying that it's no contest: get the iPhone X. However, the budget-conscious dad that wants to be able to go to Disney World every year (Ed note: el oh el) needs to get the best bang for his buck. With everything pointing to a severely constrained supply for the iPhone X, I'm thinking that by next spring, the price of the iPhone 8 Plus will be pretty low through my carrier. I would need to sign a two-year contract, but that doesn't bother me since it always costs less over the course of two years than buying the phone outright.

What will the iPhone Landscape look like a year from now?

What the future holds for the iPhone X is what is making me deeply consider the iPhone 8 Plus. There’s safety in buying the iPhone 8 Plus, since it’s the fourth refinement of the iPhone 6 form factor. At this point, Apple has worked out most, if not all, of the hardware quirks (like the bending that happened with the iPhone 6 Plus). The iPhone X is a new product and even if there’s a “ten” in its name, it really is an “X” in that we don’t know what to expect. Apple is usually pretty good with first generation products (ex. Apple Watch). However, what will the iPhone X look like after a year of iteration?
The most obvious answer is that Apple will be better at making the model. Everything points to the iPhone X being difficult to manufacture, both in obtaining components and putting them together. This in turn drives up the price. Supply will not meet demand for several months at the very least. An edge-to-edge screen is not new, but shipping one in the quantities that Apple needs, and the fact that it's a higher-quality OLED screen than others, make it more problematic than previous iPhones that featured standard screens.

Future iPhone Xs will be available at a lower price. Prices will come down for the flagship model, and of course, Apple will have the previous model around for purchase as it has done for several years. At least, this is my hope. It could go the other way: the new, higher price is here to stay, even once the iPhone X design becomes the only type of iPhone.

Notches and Batteries

I don’t see the notch going away for at least a few years. Ultimately, Apple wants everything to go away, leaving only a screen - that was the whole premise of the iPhone and the multi-touch screen; nothing but direct contact with what you’re doing on the device. The thing is, with everything that is packed into that notch, it’ll be a while before that happens. The next few years will continue with camera improvements and chip speed and efficiency. There will also likely be a version with a larger screen surface.

My biggest hope for a future iPhone is an obvious one - a much longer lasting battery. During the work week, my iPhone is used a lot, mainly because I listen to podcasts. My iPhone 6S lasts me through the day only because I charge it at my desk back to 100% at least once per day. On the weekends, this isn’t much of an issue because my iPhone is mostly on standby. However, an iPhone with double the battery life would be a huge change.

Writing this has convinced me that I may eventually want to get an iPhone 8 Plus; mostly for the better camera and battery life, and since I don’t want to pay the premium that the iPhone X will carry. The demand for the X will likely lead to a lower-priced 8 Plus once I decide to get an iPhone early next year. It’s all up in the air for now. The only thing I know for sure is that I’m getting a new iPhone sometime in the next few months, and probably going to Disney World next summer.

Equifax and You Thu, 21 Sep 2017 07:00:00 -0400 f75944ee-4c2f-4d8b-af20-1e304b334344 The Equifax hack has affected over 143 million Americans, and countless other UK and Canadian citizens. Here are some of the most striking things about the incident. Earlier this month, American credit-reporting corporation Equifax announced that it had been the victim of a database breach that left over 143 million Americans' personal information vulnerable, alongside an undisclosed number of Canadians and UK residents.

The reason the Equifax breach is so significant is the sheer amount of highly sensitive personal information the company stores on its servers. Its pervasiveness is due to its services being used by a myriad of banking institutions for everything from vetting people for a loan to verifying credit histories.

A little trading amongst friends

In-between the time that the initial breach was discovered (late July) and public disclosure (early September), several higher-ups at Equifax sold off large chunks of their stock, leading one-third of US Senators to ask the Securities and Exchange Commission and the Justice Department to investigate the individuals for insider trading. Executives sold off stock and netted almost two million dollars in early August. The timing seems to be a little suspect, and many want the US government to intervene and investigate the issue at hand.

The luck of the draw

The troubled company is offering the public a chance to verify whether or not they have been part of the breach through its TrustedID product and website. The catch, however, is that the website can't actually tell if your information has been compromised: people who have entered the same information a number of times have received randomly-generated yes/no results, so you never actually learn whether you have been affected. The site also prompts users to enroll in the TrustedID program on the results page. The Terms of Service for TrustedID appear to make it more difficult for claimants to pursue the parent company if they do sign up for the product, and many believe the site to be an easy way for Equifax to discount claimants in the future once the eventual class action lawsuit hits the courts.

Suing for fun and profit

If you're one of the many Americans affected by the breach, the DoNotPay chatbot can help you file papers. Originally designed to help combat parking tickets, DoNotPay creator Joshua Browder (one of the many affected by the breach) hopes that the bot will help "bankrupt Equifax". User beware, however: local laws governing the filing for damages differ from state to state, so depending on where you are, it may be a bit more of an involved process when it comes to properly filing your claim.

Canadians may feel the burn too

Equifax Canada is also feeling the mounting pressure to disclose how many Canucks have had their private information leaked out. The Canadian Automobile Association (CAA) recently announced that it had partnered up with Equifax as part of an identity protection program, and had about 10,000 members participating. It appears as though the American arm of the company was the one responsible for housing CAA member information, though the parent company remains tight-lipped about the extent of the breach for those outside of the United States.

South American troubles

It has been revealed that the American arm of the company isn’t the only one with security holes that needed to be addressed; it’s been reported that the Argentinian portion of the company had the credentials for an employee tool set to “admin” for both the username and password. The tool, a web application called Ayuda (Spanish for “Help”), allowed users access to the personal information of tens of thousands of people living in the country.

Blaming the Open Source guy

While Equifax was out there lobbying for easier regulation prior to its late-summer data breach, an anonymous source has been quoted as saying the breach occurred due to a vulnerability in the open source Apache Struts web framework. There are several issues with this theory, making it hard at this time to say whether or not this is indeed how hackers gained entry into Equifax's databases.

The future, or lack thereof

The US Federal Trade Commission (FTC) has recently taken the extraordinary step of issuing a statement that it is in fact presently investigating Equifax. The FTC usually does not comment on whether or not it has active investigations and probes, but due to the extensive nature of the breach, it felt compelled to let the American public know that it is looking into the matter.

However this situation plays out, it makes clear that there is a need for stronger guidelines and stronger implemented security features when it comes to storing and disclosing sensitive information. It may take years for the process to play out, given the many legal tangles that have appeared in the weeks since this story came onto the scene.

If you believe you may have been affected, a user over at the YouShouldKnow subreddit has a pretty good jumping off point to figure out actionable items that you can undertake in order to protect yourself. While you cannot change what’s already happened, you can at least step up to prevent potential future damages to yourself.

RetroPie's Nostalgia Thu, 14 Sep 2017 09:00:00 -0400 983429cd-13db-4a98-a819-90e711c54bf2 Nostalgia

I feel most nostalgic when I think about the video games I played as a kid, and as someone approaching 40, I'm sure most people of my generation feel the same way. This is why Nintendo has hit it big with the classic consoles it's released in the past two years, and it's also why retro games can carry such a relatively large price tag. I have a few of my old consoles, namely my NES and SNES, and they actually work well. I was planning on getting the SNES Classic, but that looks like it may be more difficult than I thought it would be.

Plan B

As I've mentioned in a recent post, I decided to set up a RetroPie, which is an OS you can load onto a Raspberry Pi computer to easily get old videogames running via emulation. Now that I have it up and running, I couldn't be happier with it. It's the easiest way to play the games that I spent hours with in my youth. Unfortunately, most of my cartridges have disappeared over the years, with only a few remaining (luckily they're three of my top five games). Having spent a bit of time playing with the RetroPie, the best part has been the opportunity to play these games again with my kids, showing them where stuff like The Legend of Zelda: Breath of the Wild came from.


The Raspberry Pi I ordered came as part of a package from CanaKit through Amazon, which provided the motherboard, heat sinks, case, and appropriate power adapter. I also ordered a 32GB SanDisk Micro SD card, which allows for plenty of storage. There are some great utilities that easily allow one to get the RetroPie OS onto the SD card; ApplePi-Baker made flashing the SD card with RetroPie really easy. Once this was done, RetroPie was ready to go into my Raspberry Pi.

I hooked it up to my TV and set up all the necessities, including wifi, since that's how I would be transferring over the roms I had waiting on my Mac. This process allowed me to try something I never really needed to do before: access a server on my network. To get the roms onto the RetroPie, I had to get to its file system by using the Go menu in the Finder and connecting to it as a server. I just needed to input smb://retropie and then access the server as a guest. With that done, I just needed to find the correct folder for each type of rom and the roms were transferred. After heading over to the RetroPie and restarting the EmulationStation front end, the roms were available for use. All the details on how to do this were easily found online.
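For the curious, the same transfer can be done from the Terminal once the Finder has mounted the share. This is only a sketch of the idea: the mount point and the rom name are assumptions (on a Mac the RetroPie "roms" share typically shows up under /Volumes), and here SHARE defaults to a scratch folder so the sketch is safe to try.

```shell
# Sketch: copying a rom over the mounted share from the Terminal.
# On a real Mac, after Go > Connect to Server > smb://retropie,
# you'd set SHARE to the mounted share (typically /Volumes/roms).
SHARE="${SHARE:-$(mktemp -d)}"

# Stand-in rom file, playing the part of a real .nes file on your Mac.
echo "demo" > zelda.nes

# Each system has its own subfolder (nes, snes, ...); EmulationStation
# only lists a rom if it lands in the matching folder.
mkdir -p "$SHARE/nes"
cp zelda.nes "$SHARE/nes/"
ls "$SHARE/nes"
```

The per-system folder is the important part: a rom dropped at the top level of the share won't show up in the menus.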


There were a few issues that I needed to address once I had everything set up. Since I transferred my roms from a Mac, ._ files were created alongside them. They don't actually cause issues (since they're normally hidden), but they do look ugly in the RetroPie interface. I went into the file system on the Raspberry Pi and it was easy to delete them quickly by finding them in the /roms folder and the corresponding subfolder (ex. nes). Selecting them one by one while holding shift allowed me to delete them in one shot by pressing F8. If you just have a few, you can also delete them without going into the file system, by using the EmulationStation interface.
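If you're comfortable with a command line, the same cleanup can be done in a single pass with find instead of selecting files one by one. A minimal sketch, with the caveat that ROMS here defaults to a scratch folder seeded with fake files so it's safe to try; on an actual RetroPie you'd point it at the real roms folder instead.

```shell
# Sketch: sweep out the macOS "._" resource-fork files in one shot.
# ROMS defaults to a throwaway folder with sample files for safety;
# on a real RetroPie it would be the roms folder itself.
ROMS="${ROMS:-$(mktemp -d)}"
mkdir -p "$ROMS/nes"
touch "$ROMS/nes/zelda.nes" "$ROMS/nes/._zelda.nes"

# Delete every regular file whose name starts with "._",
# leaving the real roms untouched.
find "$ROMS" -type f -name '._*' -delete
ls "$ROMS/nes"
```

Since find recurses on its own, this cleans the nes, snes, and any other system folders in one go.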

Once I removed all of the ugly, useless files, I had to start using the scraper to find the box art for the games. The scraper utility accesses the internet and looks for information that matches each file, attaching metadata to the game so that you can see it. The included scraper works fine in most cases, and I only have a few games left that don't have proper box art. I'm working on fixing this, but it's not a major issue.

By far, the most complicated thing I had to figure out was how to get Castlevania: Rondo of Blood working on the PC Engine CD emulator. To emulate the PC Engine CD-ROM (or TurboGrafx-16 if you prefer), you need to add a special BIOS file to the BIOS folder of the RetroPie. That was the easy part. I still could not get the game to run, so I had to dig a little deeper. I discovered just how deep when I found an article explaining exactly what I needed to do. There was a line of text that needed to be fixed (it didn't match the file name), but once I did that, I was finally able to play what many consider to be the best of the traditional Castlevania games. It wasn't super hard to do, but I would classify it as advanced.

Morality and Closing Thoughts

Some people may be bothered that using roms to emulate games is at best a grey area, and at worst piracy. The problem is, a lot of these games are essentially impossible to obtain or, in most cases, overpriced. Not everyone has access to these systems either. Sure, Nintendo is releasing the SNES Classic, but as I said, they're not making it easy to get, with what already seem to be supply issues. It will be even more difficult in Quebec, since two of the games on the North American system are not in French, even though the games are available in French elsewhere. Putting together the RetroPie was a great learning experience, making it worth entering a little bit of a grey zone of piracy. Most of these games have been out of print for years, and no one is really making any money off of most of them at this point anyway.

I've always liked thinking about retro games, and I listen to a lot of retro game-related podcasts. One of my favourite YouTube channels is GameSack, and they've pointed me to some of the best games I missed when they were actually available. Being able to play games on the consoles I grew up on now would be great, but I don't have access to most of them. Playing the same games on a RetroPie isn't perfect, but it does scratch that retro itch.


RetroPie Sun, 03 Sep 2017 18:00:00 -0400 d80d1bd7-1293-4319-8bc8-6ccf78e70104 I was planning on getting the SNES Classic, hoping that Nintendo had figured out that people actually wanted this thing and they would make enough to meet demand. I don't know if Nintendo will be able to meet demand (although that seems doubtful based on the preorder issues), but the main problem for me is that I live in Quebec. For the moment, the SNES Classic will not be available here because it does not meet Quebec's language requirements. I'm not going to get into details about that here, but suffice to say, it's unfortunate.

This situation has prompted me to venture into the world of the Raspberry Pi and RetroPie. I had heard a lot about RetroPie in the past, but I never really felt like investing any time or money into it. The SNES Classic situation changed my mind and I decided to order a Raspberry Pi 3 kit from Amazon.

So now I'm just waiting for my Raspberry Pi to arrive and I'll get started with the wonderful world of retro gaming emulation. I'm hoping to write a few posts about my experience and talk about it on the podcast.

Podcast Workflow Fri, 25 Aug 2017 22:15:00 -0400 084a7d88-243c-48d8-93ff-d2b9fc01d659 Starting a podcast can seem like a daunting task. Once you figure out what you're going to talk about, who is going to be on the show with you, and what hosting service you're going to use, you need to figure out how you're going to record it.

In my previous life, I was in a band and recorded a lot of music in my parents' basement. I already had a Shure SM58 microphone and the necessary accessories. It's not the best microphone for podcasting, but it sounds great and I didn't really feel the need to spend money on a new one. A podcast is all about your voice coming through clearly; the sound doesn't have to be perfect but it should be clear, and a separate microphone, be it XLR (like the SM58) or USB, goes a long way. Since I was using an XLR microphone, I did need to buy an audio interface. I chose the TASCAM US 2x2 based on Jason Snell's recommendation. I had used a TASCAM Portastudio for years before I started using a Mac to record music, so I knew I could trust the brand. It works well, and it provides more than enough gain for my voice to come in loud and clear. Ideally, you want to hear what you're recording, so I picked up the TASCAM TH-02 headphones. The price was right, but I find that they're a little too big. I may eventually replace them with something more comfortable, but for now, they're fine. A lot of confusion for new podcasters seems to revolve around recording equipment, but I was lucky enough to have familiarity with that part, so I was able to get past it quickly. My biggest questions were about the logistics of getting it all together.

Brian's previous podcast was recorded in person, with two other people. Double Density needed to be different: we were going to have to record a "double ender." That is, we would each record our side of the conversation, and then one of us (Brian) would stitch it all together and produce something that people would want to listen to. To actually hear each other, we use Google Hangouts, although Skype would work just as well. We don't use that audio in the final recording, so it's fine if we hear some audio artifacts while we record. We chose Hangouts since Brian had had a better experience with it than with Skype. It works, and as long as you can hear your co-host, what you use doesn't really matter all that much. I do think that if you're going to have guests on often, Skype may be a better fit since it is more widely used.

In our initial recordings, I used GarageBand to record my end of the show, and then I would share a high-quality MP3 through Google Drive. However, I really didn't like recording directly into GarageBand. It created huge files that I really didn't want to deal with. I wanted something a little more versatile, and a little less bulky, for lack of a better word. After doing a lot of research and seeing what a lot of my favourite podcasters used, I settled on Audio Hijack by Rogue Amoeba. The great thing with using something like Audio Hijack is that I can create three audio files at once in case something goes wrong. I have it set to record my voice directly to a high-quality audio file, as well as recording the entire call to a separate file. I'm also recording what I hear Brian saying in a third file without my side of the conversation, so that if something goes horribly wrong with his setup, we have a backup. If you listen to the show, you know I'm all about backups. In the 17 episodes I've recorded with Audio Hijack, it has crashed once on me, and it still saved the audio up until that point. If you're willing to spend a little money on some software, and you have a Mac, Audio Hijack is where you should spend it.

For now, I'm not editing the show, as Brian's previous experience editing a podcast comes in handy, and he's able to provide a quicker turnaround. All those fun bumpers and extras you hear on the show are his creativity. We're hoping to share the editing duties soon, and once that happens, I'll definitely be posting about it here. For now, I hope you find this peek into my workflow helpful, and if you have any questions, please let me know.


Four tips for recording your Unidentified Flying Object (UFO) Wed, 23 Aug 2017 17:00:00 -0400 86474387-0879-4151-99b6-097f202f800e Four tips that will allow you to better record your Unidentified Flying Object experience. Picture this:
You’re out and about with some friends on a perfectly warm summer night. You might be out taking in a local festival, or at a bar or restaurant. You part ways and head back to your car and look up, on a whim. Something catches your eye above; at first you think it’s a satellite or an airplane. But it starts moving in weird ways. Perhaps it moves in a zig zag pattern, or darts back and forth in an unreal way. Your curiosity is piqued and you pull out your iPhone to record this miraculous event.

The scenario above is not uncommon in this modern age, where everyone is armed with a smartphone and can record things at the drop of a hat. With that technology, however, come some important guidelines for properly documenting your experience. Many people who set out to document such experiences don't take the time to make sure that what they film isn't a frenetic, loud mess that raises more questions about its creator than about what's on screen.

Here’s our advice to you, cosmos watcher, on how to not screw up what may be the most singular event of your life. Without further ado, we present…

Four tips for recording your Unidentified Flying Object (UFO).

1. Don’t zoom in too much

Trying to keep your object in perspective can be hard when it's flying all around the sky. Fight the urge to continually zoom in and out in order to capture it. This will just make the viewer nauseous, and will render whatever you capture useless. Instead, try to stick to a specific zoom level in order to better contextualize what you're capturing, which may also allow you to use other objects in the frame for perspective. This will also ensure that the phone tries to autofocus as little as possible.

2. Lay your phone on a flat surface

A lot of smartphone cameras now come armed with image stabilization, which means that there is less overall shaking when you're moving and recording video. But image stabilization can't save your recording alone; laying your phone on a flat surface (e.g. the hood of a car, a fence, a table) allows for less jerky movements and a smoother video experience. Heck, even the use of something like a selfie stick (ugh) means greater control over your hand movements and thus could be considered an improvement over your bare hands.

3. Don’t yell

This one is kinda self-explanatory. You will want the people who view your video to yell out in excitement, surprise and wonderment. Allow them to do that – don’t join in prematurely. Shrieking or whooping while filming only distracts from the evidence you’re trying to present, making your video far less appealing to a viewer. An added bonus: staying quiet allows your camera or smartphone to pick up ambient sound. If your flying object isn’t silent, this is the best way to let the viewer hear it.

4. Don’t use your flash/camera light

Using your flash or camera/phone light will flood the immediate area around you with light; chances are that the object you’re recording is much farther away than the light can reach, making its use detrimental. Let the object stand on its own. Unless you have access to a far-reaching spotlight, perhaps this is worth your effort; otherwise, leave the flash off and capture the object as is, instead of capturing a sheet of white light that has nothing to do with the object of your desire.

With these four simple tips, we can promise you that you stand a much better chance of capturing your UFO. Who knows, you may even go viral.

Double Density Dot Net Sat, 19 Aug 2017 15:00:00 -0400 We registered a domain and we switched hosting services from SoundCloud to Fireside.

Double Density has become one of the highlights of my week. From putting together the show outline, to recording the show with Brian, and then getting it out there for people to listen to, it’s something I look forward to working on. Something was missing, though: a true website. Whenever we shared an episode, we would point people to our SoundCloud page, which was fine, but it didn’t feel like our own. Our Facebook page was another place we could call home, but it felt more like a coworking space – a bunch of podcasts sharing an office. Again, totally fine, but it didn’t exactly feel like home.

Once SoundCloud started having its well-documented issues, Brian and I decided that it might be time to move to a new host. The decision wasn’t easy, since we have a bunch of time left with SoundCloud that we can’t get refunded. We decided that didn’t matter, and on August 17, we moved our podcast to Fireside. It’s a service started by podcaster Dan Benjamin of the 5by5 podcast network. He knows what podcasters need in a service, and it’s an all-in-one package. Part of that package is this great website. We bought a domain, and here we are.

What I’m most looking forward to is being able to point listeners to a website that has all of our episodes, with show notes, and with extra content that we’ll publish from time to time. Just being able to say, "go to double density dot net and listen to the show" will feel liberating. And that’s not even taking into consideration all the features Fireside offers for content creation.

Our RSS feed for the podcast has already changed as of episode 17, and we’re planning to fully launch with episode 18. Once that happens, and you find your way here, please let us know what you think. For now, Fireside doesn’t offer much in the way of customization, but that’s okay because new templates are coming. The current look is nice and clean, and I’d prefer they make sure the most important parts work perfectly before allowing more customization.

Thanks for listening.