Typical K-II Interactions

This video was a good example of a typical, informal investigation using a K-II meter.  The video was long – over an hour and a half – so I didn’t watch the whole thing.  However, you could learn a few good things in the first five or ten minutes.

Here’s where it was, at YouTube.


The video seems to be gone now. (That’s true of a lot of “investigation” videos from when the ghost hunting trend peaked.)

I’m leaving this article online for two reasons:

  1. The video might return…? Maybe, but probably not at that URL. And it may be another spam effort. I’m not sure.
  2. More importantly: my summary, below, may explain the patience ghost hunters need. You can sit for hours with nothing happening.

First of all, this video showed how imperfect real-time communication is with any EMF meter, but especially a highly sensitive meter like the K-II.

There were times when the lights flickered so quickly, it was difficult to tell whether the meter flashed just once (for “yes”) or two or three times.  In fact, at least once, a team member said he didn’t see a flash when the light had blinked quickly.

This video also provided a vivid example of how tedious ghost hunting can be, particularly when you’re focusing on one specific research technique or tool.  Really, by the 47-minute mark, one of the investigators asked, “Is the fourth letter of your last name between the letters A and L?”

Wow.  That’s a very patient investigator.

You might ask, “Why not use a Ouija board, instead? It’s faster.”

The answer is personal safety.  The more people physically connect with the energy – as with a glass or planchette that points to letters – the more risk they’re taking.  With a tool like a K-II – one that requires no physical contact with the device – the dangers are reduced.

The K-II results in this video could be pretty good.  I really wanted to like it and give it a very favorable review.  However, I had some major doubts.

The TV

My first concern when using a K-II is variable environmental electromagnetic energy.

Right away, I saw the TV in this video’s background.  Is that enough to cause normal EMF fluctuations?  Unlikely, but I wouldn’t rule it out until I’d checked it carefully. I’m not sure the guy in the video did that.

The cat

At times, a cat was on the bed where the K-II was.  I’m not too worried about that because I saw no reaction from the K-II when the cat was nearby.  Also, one of the researchers seemed to sit on the bed with enough vigor that the K-II moved around, but the K-II didn’t react to that, either.

The fan

The rotating fan behind the EMF meter was a greater concern.  I thought I noticed more flashes after the fan moved to the far left and had just begun the return motion, but I wasn’t sure. (I’m still not sure.)  I’d definitely want to study some freeze-frame shots of the moments when the K-II is flashing.

Response synchronicity

I casually checked the frequency of the K-II responses.  In the first five minutes, the timing seemed odd.  In a spot-check near the beginning of the video, I noted K-II flashes at these times:

  • 1:21
  • 2:21
  • 2:28
  • 3:20
  • 4:20

In other words, the K-II was flashing about once a minute, usually around the :20 or :21 mark.  If that pattern continued – or even recurred sporadically – I’d discount all of those flashes.

However, the 2:28 response was anomalous and fairly strong, so I’d be more likely to take it seriously – as long as no other strong flashes sync with :28 marks.
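The spot-check above can be done by eye, but for longer videos it helps to automate it. Here’s a minimal sketch in Python that flags flashes sharing a repeating within-minute offset; the function names, the 5-second tolerance, and the “most common offset” heuristic are all my own assumptions, not part of any standard K-II analysis.

```python
# Sketch: flag K-II flashes that fall on a repeating once-per-minute
# schedule. Timestamps are "m:ss" strings noted from the video; the
# 5-second tolerance is an arbitrary choice for illustration.
from collections import Counter

def seconds_within_minute(timestamp):
    """Return the seconds portion of an 'm:ss' video timestamp."""
    _, seconds = timestamp.split(":")
    return int(seconds)

def flag_periodic(timestamps, tolerance=5):
    """Split flashes into (suspect, anomalous): 'suspect' flashes share
    the most common within-minute offset, +/- tolerance seconds."""
    offsets = [seconds_within_minute(t) for t in timestamps]
    most_common, _ = Counter(offsets).most_common(1)[0]
    suspect = [t for t, o in zip(timestamps, offsets)
               if abs(o - most_common) <= tolerance]
    anomalous = [t for t in timestamps if t not in suspect]
    return suspect, anomalous

flashes = ["1:21", "2:21", "2:28", "3:20", "4:20"]
suspect, anomalous = flag_periodic(flashes)
print(suspect)    # the flashes clustered near :20/:21
print(anomalous)  # the 2:28 flash stands apart
```

With the timestamps noted above, the 2:28 flash ends up in the anomalous group while the :20/:21 flashes cluster together, matching the by-eye conclusion.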

That’s the kind of analysis researchers must do in more formal investigations. On the other hand, this looked like a very informal one.

If I were analyzing this video as part of a formal investigation, I’d be concerned about the TV and the rotating fan.  Also, I’d wonder what else was in the room – or near enough to affect a K-II – that we don’t see in the frame of the video.

And, finally, the biggest credibility issue connected with this video was how it was uploaded to YouTube.

Keyword stuffing

In a misguided attempt to attract more viewers, the foot of the video description was stuffed with keywords that weren’t related to ghosts, such as “epic funny Santa Claus prank Christmas pranks bloopers,” “50 Cent The Voice” and “make money free cash” and “Black Friday Walmart black Friday.”*

I suspect the research team received bad advice about that tactic.  Please, don’t stuff keywords if you want to look like a serious researcher. (On the other hand, if your main goal is to boost your numbers to look popular or earn more money from your YouTube videos… Err, umm, no… what am I saying? That’s never okay.)


All in all, this was a good video to learn from.  And, the results might be impressive in a different context.

If this were one of several supporting investigations related to a single haunted site, it might carry some weight, but I’d need far more compelling evidence.

For starters, I’d like to have seen a detailed analysis of the video, especially related to the rotating fan and the timing issues.  Without that, there were too many red flags to trust the results. Also, it would have been simple to eliminate most or all of them, in a follow-up investigation, if they were serious researchers.

Originality  (Doesn’t really apply. It’s a K-II meter.)


Credibility  (The results were pretty good, but the context – especially the timing issue and the keyword stuffing – raised huge red flags as far as I’m concerned, and made the entire effort look questionable.)


* No matter who tells you that keyword stuffing is a good idea to get more YouTube views, don’t do anything like the screenshot below.  It looks spammy, reduces your credibility, and… really, do you want people finding your serious ghost hunting video using search terms like “prank ghost video” or “swimsuit boys dance gangnam style”?

