All posts by Green Eye Tactical

Former Army Tier 1 Operator, Trainer, Dad

The Tactical Industry and the Erosion of Constitutional Rights


Earlier this week, a Facebook post by the Oregon City Police went viral and resulted in severe backlash against the knife company Benchmade.

Right up front, I want to restate, as I have in the past, that Green Eye Tactical trains RESPONSIBLE citizens. That is how I look at it. Whether you are civilian, LE, Fed, Military, or whatever, we will offer the same high-level training. I provide no training for LE or Mil that I would not offer to civilians.

I base my position on the issue on the Constitution and its original intent. Period. I’m not a “but”-er. You either believe in the Constitution, as written, within the purpose and frame of when the founders wrote it, or you don’t.

I do have a high degree of respect for our Law Enforcement Officers. They do a completely thankless job where they are underpaid, underfunded, undermanned, undertrained and under-equipped. And it is OUR fault as tax paying voters and citizens. However, you are volunteering for a job where you are enforcing laws and protecting rights- and all that entails.

Just as there are people who I would not train, there are departments and agencies that I will not teach. One of my local departments is on my blacklist due to its ingrained position on engaging someone for the mere presence of a firearm, whether its use was legal or not. Everyone has to search themselves and determine where they stand and what lines they will not cross. That’s mine. Your lines may be different, and that’s fine. It is your choice. /rant off.

Back to the issue at hand: in this post, the department thanked Benchmade for assisting in destroying confiscated firearms in the Benchmade shop. Pictures of the destruction were also posted.


This post quickly spread on social media pages and has resulted in numerous boycott campaigns. The new leading 2a advocacy group, the Firearms Policy Coalition (if you aren’t following and supporting this outfit, you need to look into it), even posted a screenshot of its FOIA request to Oregon City Police.

So, why is this such a big deal?

Well, if you have been following the news lately, there have been massive encroachments on many Constitutional rights across the country, including, in this case, the Second Amendment. Recently, Oregon, along with some other states, narrowly passed Red Flag laws. These laws allow for the immediate confiscation of firearms, without due process, and they are a very sensitive issue in the industry today.

When we put things together and view the totality of circumstances, we see a police agency in a Red Flag law state destroying confiscated firearms. We have a popular company in the industry aiding them (interestingly, committing a federal felony in one of the pictures). This can give viewers the perception that the Police may have taken these firearms without due process in one of these Red Flag cases.

That might sound like a bit of a leap in logic, mixed in with assumptions- but there is more to the story.

As with most things, when someone calls attention to something, that thing gets looked into: https://www.opensecrets.org/orgs/summary.php?id=D000047693&cycle=A&fbclid=IwAR0YxWklPFmV9uCF7Cf2I4SvWyy0Ant_Y55ccxB6op1-7zP0LUO-jUQJQRI

It looks like, starting in 2012, Benchmade increased political spending. They also ceased contributions to Conservative candidates and donated solely to Democratic candidates.

Benchmade donated to two primary state officials in 2017-2018.

The first is Rep. Kurt Schrader, the representative for Oregon’s 5th Congressional District. He is not a pro-2a official. In 2012, after the Connecticut school shooting, he was quoted by The Oregonian as saying:

“However, while the immediate focus will inevitably be to ban certain firearms and magazines, the problems we have are more deeply rooted in American society.”

If that 2012 date looks familiar, that is when Benchmade’s political contributions switched from small donations to Conservative candidates to more substantial contributions to Democratic candidates.

The second contribution is even more puzzling than the first. That donation was to Senator Martin Heinrich of NEW MEXICO. I scrubbed through all of Benchmade’s business listings and couldn’t find any business interests in that state.

Martin Heinrich is an interesting name. He is not well known, but he was one of the names in the hat to be Hillary Clinton’s Vice-Presidential pick for the 2016 election. This guy has MUCH scarier views on the Second Amendment.

In 2018, he was quoted by the Albuquerque Journal as saying:

“It doesn’t matter if you have a folding stock, it doesn’t matter if you have a pistol grip … but it does matter how the action cycles and how fast they (the weapons) can throw lead downrange, and how many times you can pull the trigger before you’re out of ammunition. Those are the functional things that are at the heart of what makes these (guns) so dangerous compared to other firearms.”

Yeah, this guy wants to move past targeting cosmetics and redefine the language to restrict firearms by function, capability, and effectiveness.

Read the interview to grasp the full extent: https://www.abqjournal.com/1153462/heinrich-hopes-to-influence-gun-debate.html

So, now when we look at the totality of circumstances, we see that Benchmade changed its politics back in 2012. They supported two leading candidates, both anti-2a Democrats, one of whom isn’t even in their state.

Oregon passes controversial Red Flag confiscation laws, and these laws are a very hot button topic in the industry today. Then Benchmade’s local police department asks Benchmade for help in destroying confiscated firearms, and they happily comply.

Both Oregon City Police and Benchmade have posted replies where they apologize or walk back the posts. I think they are also very telling:

First off, the Oregon City Police wrote the post craftily. Pay attention to:

“We receive guns that are turned in from community members that they no longer want the guns and want them destroyed. We also have guns that are evidence and when a case is adjudicated the guns are ordered by the court to be destroyed.”

This post is a broad and general statement. It does not say “the guns in this picture” or “the guns in this post.” If you’ll excuse the pun, this immediately raises some red flags for me.

If criminals used these firearms in violent felonies (robbery, assault, etc.), I think most people wouldn’t be too upset. Equally, it is the property owner’s choice to do what they want with their property. So, if some gun owner doesn’t want their firearm anymore and takes it willingly to the police to have it disposed of, then hey, that’s your choice. I may not agree with it, but that’s your right. Regardless, what they are NOT definitively stating speaks volumes, at least to me.

Likewise, if you look at Benchmade’s response above (or in the gallery of pictures if you are on FB), it is utterly devoid of addressing the issue that has caused this uproar. They simply state:

“Oregon City Police requested the use of specialty equipment within the Benchmade facility to follow these requirements, and as a supporting partner of our local police force, we obliged the request.”

And then they follow up with:

“Benchmade is a proud and unwavering supporter of both law enforcement and Second Amendment rights”

Yet they make zero statements specifying which “rights” they support, nor do they address the gun confiscation issue. How simple would it be to state:

“We were assured by OCP that all firearms destroyed were used in violent crimes or destroyed at the owner’s request”? Case closed at that point for me.

You can’t hide behind policy or law if it is unconstitutional. You can’t hide behind “I was following orders” if that order was unconstitutional or immoral, period.

I reserved judgment when this issue went viral because I had not yet had time to research it adequately, which is why my article here is late to the party. However, at this point, I feel that there is enough to make a statement on the subject.

Here it is:

Green Eye Tactical will no longer recommend Benchmade products. While Green Eye Tactical has never been contacted by or provided training for Oregon City Police in the past, they will be added to our Blacklist and will be ineligible for any Training, Consulting, or Advice in the future. If either Oregon City Police or Benchmade issues a definitive statement that clearly lays out their position on the pertinent issues (following policy doesn’t count), then we will re-evaluate this position.

It is my hope that other product suppliers, consumers, and trainers in the industry make their positions on the issue clear as well.

Church Safety Webinar

Church Safety Webinar – recorded: 1730 CDT Monday, 20Nov17
*There is no live video- you will see a freeze frame with real-time audio*

Green Eye Tactical hosted a panel of subject matter experts in the hopes of dispelling some myths and preconceptions about training and preparedness. We discussed a wide variety of topics regarding preparing the Congregation and the Church. Originally, this was a closed session; however, after review, we feel the content is general enough for open dissemination. We do not discuss specific Tactics, Techniques, and/or Procedures on the internet- so if you have a specific question or comment, please contact the relevant person below.

Subject Matter Expert panelists:

Pastor John Mark Caton, of Cottonwood Creek Baptist Church in Allen, Texas, where he has served since 1995. John Mark earned a Bachelor of Business Administration degree from Baylor University in 1987 and a Master of Divinity with Biblical Languages degree from Southwestern Baptist Theological Seminary in 1992. He received his Ph.D. from Southwestern Baptist Theological Seminary in 2001.
http://johnmarkcaton.com | http://cottonwoodcreek.org
info@cottonwoodcreek.org

Monti Leija of Quiet Professional Group, a former Special Forces soldier, JSOC Special Missions Unit medic, and training/tactical expert:
leijamontgomery@gmail.com

Matt Wilson of Special Operations Contingent Group, a former Army Ranger, Special Forces soldier, JSOC Special Missions Unit heavy breacher and covert method of entry specialist and training/tactical expert:
https://www.soc-g.com
matt@soc-g.com

Carol Wintzinger, owner of Scenic 360, a company specializing in Landscape Design with a security mindset.
Carol@scenic-360.com

T. Edwin Walker of Texas Law Shield, a member of the State Bar of Texas, the Texas Criminal Defense Lawyers Association, and the Harris County Defense Lawyers Association, and a legal expert on Use of Force law.
https://www.texaslawshield.com | https://www.uslawshield.com
tewalker@walkerbyington.com

Don Oxman of The Oxman Group, a licensed investigation and security consulting firm specializing in cybersecurity:
https://theoxmangroup.com
don@theoxmangroup.com

Reagan Cole of Texas Defense Articulations, a TCOLE, DPS, NRA Firearms, and Texas LTC Instructor, and Licensed Texas Security professional:
https://www.texasdefenseart.com
Texasdefenseart@gmail.com

Eric Dorenbush of Green Eye Tactical, former JSOC Special Missions Unit operator, NRA Firearms Instructor, Texas LTC Instructor, and training/tactical expert.
www.GreenEyeTactical.com
eric@greeneyetactical.com

The CQB Distance Myth

I often encounter individuals or agencies that have misperceptions about what CQB distance is. Often it drives their equipment choices, training, and preparation. I have even heard other trainers or Internet personalities guffaw at the suggestion that CQB does exist past 25yds. I hear many justify their 10.5” barreled SBR due to the “fact” that they only use it at close range for CQB. 
First, I’d like to clarify a misconception: Close Quarters Battle (CQB) is the tactics and techniques of combat inside structures. For many, the word “Close” in the term artificially narrows the application of the TTPs, because they confuse it with Close Quarters Engagement/Marksmanship (CQE/M). CQE/M are the marksmanship techniques for quickly and efficiently engaging targets at close distance. 
Do we use CQE while conducting CQB? Absolutely. 
But, does CQB occur at distances where CQE techniques are inefficient? Same answer- Absolutely.
Confused? Think I’m full of crap? Let’s break down what we are talking about. 
Many myths about CQB are due to training limitations and the use of Simunitions/UTM as a replacement for live fire training. If a person/unit/agency has a live fire shoot house facility, chances are it is fairly limited. Most rooms will be standard size and it might have a hallway that pushes 20-30 yards if they are lucky. Many LE agencies may only have a firing range that goes out to 50yds. So, if that is the world we train in, we begin to adapt to it. We see people setting up their zeros for the training environment, we see tactics geared towards success in that environment, and we see perceptions develop based on that environment. 
Now, what happens if we place this same methodology that I just laid out in a practical scenario? Let’s use a hypothetical SWAT team as an example. Keep in mind the distance they have shot at and the training environment they are used to. Our SWAT team gets called out to an active shooter situation at the local High School. They conduct an emergency assault and enter into the main entrance. For modern High Schools (which are massive), how does the entry foyer compare to the standard shoot-house room? What distances are involved here? Now, let’s say they enter the main corridor (hallway) to begin clearing classrooms. For central access corridors in the same large High Schools, what length are we talking about? What about a hotel hallway? Sports Stadium? Movie Theater? 
Getting the picture now? All of these distances can be well in excess of the 25-50yds that has become our comfort zone. “But that’s not CQB”, you say. You’re wrong and confused: it is CQB, and we use the same techniques to address it. However, CQE techniques may not be appropriate for it. I’m not going to be able to pick up a low percentage shot, like a hostile holding a gun to someone’s head, on the move entering a 200yd corridor like I would entering a 10-15yd room. I may need to use techniques and/or positions that are generally accepted as only for the open field, like the kneeling or even the prone. Yep, that’s right, I said prone and CQB in the same sentence. Now, don’t take that to the Nth degree- just use the right tool for the job. Remember, the decision as to what position we use is generally a trade-off between the stability needed for the required accuracy and the time available for the shot. Do you feel comfortable taking a head shot, in a school, to rescue hostages, at 200yds while moving? If you said yes, I challenge you to go try it.
Now, let’s layer on a bit more to this soon-becoming-a-nightmare scenario and add the equipment chosen due to this 25yd CQB belief. The same SWAT team is using 10.5” short-barreled AR-15’s with Hornady 62 grain TAP ammo. Far-fetched, right? That never happens… And let’s add a scenario- Operator #1 enters at the far end of a central access corridor that is over 200yds in length. He is immediately presented with an active shooter that immediately grabs a hostage as a shield. The active shooter then continues to shoot at people. Let’s assume Operator #1 does everything right- calls for the shot, adopts the position he needs, and applies the necessary fundamentals to take the head shot, since that is all that is exposed. Does Operator #1 go home a hero? Nope. Operator #1 had his 10.5” SBR zeroed at 50yds because that’s the only distance they shoot at and/or some instructor told him that a 50/200yd zero is the bee’s knees because ~reasons~, completely disregarding the mathematical impossibility of a 62gr .224 round with a muzzle velocity of 2400fps having a trajectory that breaks the Point of Aim at 50 and 200yds. Operator #1 put a round squarely in the chest of the hostage because he didn’t know that his Point of Impact was actually 9.4” lower than his Point of Aim at 220 yards. To make things worse, the round failed to expand because it was only traveling about 1,680fps at that same 220yds- so it smoked right through the hostage, the active shooter, and another hostage behind him. 
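The trajectory claim in this scenario can be sanity-checked with a few lines of code. The sketch below is a deliberately crude flat-fire model: it assumes velocity decays linearly with distance between the two velocities given above, and it assumes a 2.6” sight height over bore, which is not from the original text. It is illustrative only, not a substitute for a real ballistic solver.

```python
import math

G = 32.174  # gravity, ft/s^2

# Scenario values from the text: 62gr .224, 2400 fps at the muzzle,
# ~1680 fps remaining at 220 yd. The 2.6" sight height is an ASSUMED
# value, typical for an AR-15 with an optic (not from the original).
V0, V_FAR, FAR_FT = 2400.0, 1680.0, 220 * 3.0
SIGHT_HEIGHT_IN = 2.6

def time_of_flight(range_ft):
    """Flat-fire time of flight, assuming velocity decays linearly
    with distance (a crude stand-in for a real drag model)."""
    k = (V0 - V_FAR) / FAR_FT            # fps lost per foot of travel
    return math.log(V0 / (V0 - k * range_ft)) / k

def drop_in(range_ft):
    """Gravity drop below the bore line, in inches."""
    t = time_of_flight(range_ft)
    return 0.5 * G * t * t * 12.0

def poi_vs_poa(range_yd, zero_yd):
    """Point of Impact relative to Point of Aim, in inches."""
    # Bore is angled up so the bullet crosses the line of sight at zero_yd
    slope = (SIGHT_HEIGHT_IN + drop_in(zero_yd * 3.0)) / zero_yd
    return -SIGHT_HEIGHT_IN + slope * range_yd - drop_in(range_yd * 3.0)

poi_220 = poi_vs_poa(220, 50)
# Find where the bullet recrosses the line of sight after a 50 yd zero
far_zero = next(r for r in range(60, 250) if poi_vs_poa(r, 50) < 0)

print(f"POI vs POA at 220 yd with a 50 yd zero: {poi_220:.1f} in")
print(f"Far zero crossing: ~{far_zero} yd")
```

Even this rough model puts the round several inches low at 220yds and puts the far zero crossing in the low 100s of yards rather than at 200, which is the point being made: with this barrel/ammo combination, a true 50/200yd zero does not exist.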
Let’s summarize this diatribe here a bit:
1- CQB does not = short range.
2- Adapt your tactics, Training, and Equipment to meet your operational requirements, not to meet your training limitations or comfort zone.
3- SBR’s are not the best tool for CQB if you do not have optimized ammo.
4- Be equally capable of engaging at 250yds as you are at 25yds.
5- Standardized zeros do not work in the civilian/LE industry due to non-standard barrel length/ammo combinations.

Spiritual Training

Our Spiritual preparedness goes hand in hand with our practical preparedness. The time we spend on the range training is important, but it is enhanced by purpose.

Templar Order patron Bernard of Clairvaux wrote that the Templars were “a fearless knight, and secure on every side, for his soul is protected by the armor of faith, just as his body is protected by the armor of steel. He is thus doubly-armed, and need fear neither demons nor men.”
Our modern tactical industry is enamored with ancient warrior culture. Frequently we will see a brand or a person displaying a graphic of a Spartan or Viking. Perhaps they might display a phrase like “Molon Labe” or “Deus Vult”. But are we associating ourselves with these things because we aspire to be like their originators, or is it just a fad we have hooked onto to reinforce false bravado?

But do those who display shirts and patches depicting these ancient warrior notions truly grasp the spiritual commitment that they demonstrate? The Book of Daniel, 3:17-18 lays out a clear bar for the depth of this faith, “17 If it be so, our God whom we serve is able to rescue us from the furnace of blazing fire, and He will rescue us from your hand, O king. 18 But even if He does not, let it be known to you, O king, that we are not going to serve your gods or worship the golden image that you have set up!”.

Just to be clear, this is faith strong enough to walk into a blazing fire whether there is divine intervention or not.

1 Samuel 17:45-47 further displays the level of commitment: “45 David replied to the Philistine, “You come to me with sword, spear, and javelin, but I come to you in the name of the Lord of Heaven’s Armies—the God of the armies of Israel, whom you have defied. 46 Today the Lord will conquer you, and I will kill you and cut off your head. And then I will give the dead bodies of your men to the birds and wild animals, and the whole world will know that there is a God in Israel! 47 And everyone assembled here will know that the Lord rescues his people, but not with sword and spear. This is the Lord’s battle, and he will give you to us!”

Prior to this passage, David declined ACTUAL armor and fearlessly committed to single combat with an opponent who had struck fear in the hearts of the Israelites. He instead took up the armor of God, as defined by Paul in Ephesians 6:10-17. “10 Finally, be strong in the Lord and in his mighty power. 11 Put on the full armor of God, so that you can take your stand against the devil’s schemes. 12 For our struggle is not against flesh and blood, but against the rulers, against the authorities, against the powers of this dark world and against the spiritual forces of evil in the heavenly realms. 13 Therefore put on the full armor of God, so that when the day of evil comes, you may be able to stand your ground, and after you have done everything, to stand. 14 Stand firm then, with the belt of truth buckled around your waist, with the breastplate of righteousness in place, 15 and with your feet fitted with the readiness that comes from the gospel of peace. 16 In addition to all this, take up the shield of faith, with which you can extinguish all the flaming arrows of the evil one. 17 Take the helmet of salvation and the sword of the Spirit, which is the word of God.”

I would pose to you to consider these thoughts and how they affect your readiness in life events. Preparedness does not just mean physical; it is also spiritual. Someone who is equipped for both will have a clear advantage.

Comparative Study of Red Dot Sight Parallax

I’d like to thank you for viewing this report. It took a considerable amount of time to finish. I would like to apologize that parts of it may not appear formatted properly in your browser. Depending on what browser you are using- some of the images may not be visible due to the .tiff format. The original report is a PDF. I used a simple plugin to convert the document to website HTML.

If you would like to download a PDF copy of the report or prefer reading it on paper, you can download it here:
https://www.dropbox.com/s/5zgsq2kq6jri8bd/Red%20Dot%20Test%20Report%20.pdf?dl=0

There is a short video of 4 of the optics tested, demonstrating the report findings here:
https://www.youtube.com/watch?v=81X4dWcIM5c

A buddy that helped out with the data compilation actually made an interactive website where you can play with the results:
https://public.tableau.com/profile/largo.usagi#!/vizhome/optics/OpticStory

If you would like to replicate this test on your optics- you can download the complete test protocols and forms here:
https://www.dropbox.com/sh/obpot49wnlitcql/AAC6XpeK8SZq9QLBXjxXthzba?dl=0

Enjoy the report.

While you read the report- check out the parallax video at: https://www.facebook.com/GreenEyeTactical/videos/1537266282979623/

Comparative Study of Red Dot Sight Parallax

Eric Dorenbush

Green Eye Tactical

Brad Sullivan

Xxxxxxx

March 2017

Author Note

Eric Dorenbush, Owner/Instructor, Green Eye Tactical

Contact: eric@GreenEyeTactical.com

Brad Sullivan, xxxxxxxx, xxxxxxxxxx

Contact: Largo.Usagi@gmail.com

This study was supported by volunteer efforts and received no outside funding sources.

** THE RESULTS IN THIS REPORT AND THE ACCOMPANYING ANALYSIS AND/OR EDITORIAL ANALYSIS ARE SOLELY BASED ON THE SAMPLE GROUPS TESTED. WHILE IT MAY BE POSSIBLE TO DRAW INFERENCES TOWARDS AN OPTICS MODEL IN GENERAL– NO DATA, ANALYSIS, COMMENTS, OR STATEMENTS IN THIS REPORT SHOULD BE CONSIDERED AS A STATEMENT TOWARDS ALL OPTICS IN GENERAL AND ARE ONLY BASED ON THE SPECIFIC DATA COLLECTED AND THE SPECIFIC MODELS TESTED. **

Abstract

This report evaluates the possible aiming dot deviation in red dot optic sights due to the parallax effect, resulting from angular inclination or deflection of the user’s view angle through the maximum possible angle of view on the horizontal and vertical planes. The report consists of two separate testing events: one held by the author of this report, and another from various independent volunteers across the country who replicated the test and submitted data from their observations. The results of the data collection reveal interesting trends that conflict with the commonly held notion that all red dot sights are equally susceptible to parallax-induced Point of Aim deviation.

Introduction

Editorial: During my years as an instructor, I have noticed a variance in the sensitivity of certain optics models with regards to Point of Impact (POI) shift due to inconsistency with the shooter’s head position and alignment behind the optic. This issue has arisen repeatedly in a specific course I teach called the Tactical Rifle Fundamentals Course (TRF). During this course, shooters are trained in marksmanship fundamentals through instruction in various concepts that can affect the two firearms tasks: One– Properly point the weapon, Two– Fire the weapon without moving it. Obviously, there are several errors a shooter can induce that fall under one or both tasks. After data collection and custom zero development, shooters begin grouping at their custom zero distance. Depending on their barrel characteristics, measured muzzle velocity, atmospheric data, and bullet characteristics, this distance can generally vary between 24 and 56 yards.

After initial instruction, shooters begin to fire 5 round groups at circular bullseye targets. After each group is fired, shooters move down range to the targets, and I conduct a debrief of each target, issuing feedback and sight adjustments as necessary– with the goal to increase the consistency of their grouping and to eliminate the shooter induced error. This is often a troubleshooting process that can take consecutive groups. Shooters continue this process at 100 yards, 200 yards, and 300 yards in various standard shooting positions.

At the very first TRF course I taught three years ago, I noticed that shooters using one particular optic model had difficulty completing this process. The shooters using these optics produced a POI shift with each group fired that could range from a quarter inch to two inches. After exhausting all mechanical error possibilities, I attempted to fire a group with one of their weapons. 
While adjusting my head position and assuming the prone supported position that they had been using, I noticed some irregular movement of the aiming dot in the optic. I then checked the optic by moving my head vertically behind it while keeping the weapon immobile. It was then that I noticed that the aiming dot moved in a circular arc that was not on an axis directly equal to or opposite of my view angle. Without telling the rest of the class what I saw, I asked them to perform the same check and, when satisfied, to stand up without saying anything and let each of the other students perform the check. After everyone had completed the check, I asked the class what they had seen. They all independently described exactly what I saw. We then performed the same actions with the other student’s optic of the same model, with the same results– except for one thing– the optic moved in a completely different arc.

Over the next three years, I saw many of these optic models. Without exception, every user of this optic demonstrated POI shift; every optic displayed irregular and excessive aiming dot deviation when observed; every student in each class confirmed the same observations; and every optic displayed a different arc pattern of movement. Again, this was without exception. Over these years as an instructor, I reinforced to users of these optics at the TRF course the need to keep the aiming dot in the center of the viewing tube. I proposed the option of referencing the front sight post’s position relative to the aiming dot as a spatial reference when firing (not placing the dot on the front sight post, just noting the position relative to it) to increase consistency. Often this minimized the POI shifts, but it never eliminated them. After three years of observing the same issue with this sight and the effect it had on my clients’ ability to progress with the rest of the class, I decided to disallow its use in this one specific course. 
I made this announcement on my company’s relevant social media pages. Shortly after, this equipment restriction was shared to an industry forum by an unknown user. My statement stirred some emotions with some in the industry. While this announcement was not an industry press release or advice to anyone in the industry, as it pertained to just one of my courses, the reaction treated it as such.

In my assessment, this misunderstanding exposed a major issue in our firearms community. We have evolved, generally, into an echo chamber where broad statements and opinions drive the validity of equipment. After extensive research, I also realized the severe lack of any independent and peer reviewed testing. To a large extent, most simply take the word of manufacturers– or the word of another person or organization that says, “they saw it happen”. So, while my announcement was taken out of context as industry advice, I realized that it was an opportunity to, at the very least, try to do something right.

Ideally, as a community, no one should react emotionally to someone who states a point of view on a piece of equipment or on a theory that differs from one’s own. One should simply request that the person substantiate their view by producing data that can be reproduced and verified. One should never discourage someone in this undertaking, for if one knows that one’s point of view is correct– then the tester’s data will prove it. If the tester’s resulting data or model is believed to be flawed, then one can, in turn, reproduce their test and demonstrate it to be inaccurate. This is the basis of the Scientific Method. We can then keep each other honest, because statements will need to be supported by evidence that can be replicated. Society would be improved if more people approached controversial topics in this manner.

Therefore, I have committed to this endeavor by dedicating a portion of my time, without pay or incentive, to produce a test, the resulting data, and this report for the community to review. It is my deepest hope that it will drive others to reproduce what I have done, in order to either prove or disprove my results. I want to thank all who are reading this, and I hope that readers find this report to be interesting.

Eric Dorenbush

Green Eye Tactical

Methods

This report consists of two separate testing events:

  • The first test was conducted on 11 March 2017.
  • Follow up tests were performed independently by users across the country and were then submitted electronically.

The following section defines the initial test performed.

Testing Plan (Original Test, 11 MAR 2017)

Purpose:

To measure and compare data on red dot sight aiming dot deviation, due to parallax, in various models of optics at variable ranges.

Method:

Users induce angular deviation in their angle of view, from one extreme to another, in order to replicate head position inconsistency and measure the maximum possible parallax deviation.

Goals:

This test is intended to:

  • Objectively evaluate each device
  • Establish specific controls to ensure consistency between testers and devices
  • Establish clear protocols to ensure repeatability and ease of peer review
  • Comment only on confirmed observations, without hypothesizing as to the causes

Calibration Targets:

[Photo: calibration target]

** Note: The picture above was not taken at a perpendicular angle to the target face. Due to perspective, the target is not measurable in this photograph. **

Calibration targets were constructed on white corrugated plastic target backers 24” wide x 44” tall. The lower 6” portion of the target consisted of 8 black vinyl circles that ranged from 0.5” to 4”, in 0.5” increments. The upper target area measured 38” x 24” and consisted of a marked black grid line pattern with 1” line spacing. The central horizontal and vertical lines were red. At the center of the target, was a 4” black vinyl circle with a 1” circular hole in the center.

Test Sheets:

[Image: Original Optics Test Form]

Testers were issued a testing form to be used for each optic separately at each distance observed. The concept of the form was that the tester would fill out the relevant data fields in the top margins. The tester would then use a colored pen to draw the path he saw the aiming dot move, relative to the center of the target, on the representation of the calibration target on the test sheet. The tester would use a separate color for the vertical and horizontal movement tests, noting the color used in the color legend fields. The tester would then mark the coordinates for the end points of each line trace. Testers did express some confusion as to the Cartesian coordinate system, as well as the format for an entry that separated the X axis values from the Y axis values. Testers were instructed that they could leave the coordinate fields blank and that the diagram would be used to derive the coordinates from. This format was modified in later tests.
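As a sketch of how such trace endpoints can be reduced to an angular figure, the grid’s 1” line spacing lets an offset read off the target be converted directly to minutes of angle. The endpoint coordinates, distance, and function name below are hypothetical illustrations, not data or tooling from the study:

```python
import math

def deviation_moa(offset_in, distance_yd):
    """Convert an offset in inches on the calibration grid, observed at
    a given target distance, into minutes of angle (MOA)."""
    return math.degrees(math.atan2(offset_in, distance_yd * 36.0)) * 60.0

# Hypothetical endpoints read off a test sheet, in grid inches (x, y):
x1, y1 = -1.5, 0.0
x2, y2 = 2.0, 0.5
travel_in = math.hypot(x2 - x1, y2 - y1)   # total dot travel on the grid

print(f"dot travel: {travel_in:.2f} in "
      f"= {deviation_moa(travel_in, 25):.1f} MOA at 25 yd")
```

Because 1 MOA subtends roughly 1.047” at 100 yards, the same inch offset represents a much larger angular deviation at closer target distances, which is why recording the observation distance on each form matters.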

Planned Control Measures:

  • All optics used were to be assigned an ID number, labeled, photographed, and have their serial numbers and models recorded on a master sheet. All optics would be associated with a tester ID for the purpose of following up on inconsistencies


  • All testers would be assigned a tester ID number, which would be used on tester evaluation forms. Testers’ names and personal data were to be kept on a separate roster and not used on publicly released results, to protect privacy. Testers could choose to reveal their identities independently. 
  • If optics were weapons mounted, then they would be cleared and flagged before test use. 
  • Live fire shot groups would not be used to record results for this test, in order to rule out errors in the fundamentals as a factor. 
  • All targets used would have their measurements confirmed by the testing group before use. Any inconsistencies would be recorded. All targets would be set up and leveled with a level confirmed by the testing group. Targets would be placed at distance using a laser rangefinder. 


  • Weapons would be clamped to a rest and the sight will be leveled with a level and confirmed by at least two separate testers before observation. 
  • All testers would be observed by at least one other tester during observation to ensure consistency.
  • To eliminate the potential of the test organizer inducing bias, the testing group would elect a test leader, who had the responsibility of effectively running the test.
  • The testing group would elect a recorder who would manage all testing forms until the test was complete.
  • All weather and atmospheric data at the time of the test would be recorded 
  • All optics would use a fresh lithium battery

Planned Procedures:

  • The testing group would inspect and clear all weapons used during testing, under the test organizer’s supervision.
  • Testing group would inspect each optic, thoroughly clean all optic lenses, and check batteries.
  • The testing group would inspect each optic’s serial number, record the number on a master sheet, and assign an ID number to be attached to each optic for ease of reference during testing.
  • The testing group would install and inspect each calibration target, confirming the precise distances from the testing table to be 25 yards, 50 yards, and 100 yards.
  • Members of the testing group not involved in the testing of the optics on the table would be sequestered in a holding area, so as not to be biased by observing other testers’ findings.
  • Testers would conduct the following tests:

    • Point the aiming dot at the 0.5 inch to 4-inch sub-tension circles at the bottom of the calibration target to observe and measure the perceived aiming dot size at variable distances, and record the results on the evaluation sheet.
    • Point the aiming dot at the center of the calibration target.
    • Confirm the dot is centered on the calibration target by a second tester.
    • Without touching the weapon or optic, the tester would conduct the vertical movement test by recording the linear measurement of the aiming dot movement, measured perpendicular to the line of sight, relative to and at the intended point of aim, as observed through the maximum viewing angle of vertical inclination and declination.
    • The tester would then conduct the horizontal movement test by recording the linear measurement of the aiming dot movement, measured perpendicular to the line of sight, relative to and at the intended point of aim, as observed through the maximum and minimum viewing angle of horizontal deflection.
    • This process would be repeated at the 25-yard, 50-yard, and 100-yard calibration targets.


  • A major focus of the planned procedures was to keep them as simple as possible, so that any interested party would not be discouraged from attempting to replicate the test, either to confirm the results in this report or to test their own personal equipment.

Testing Narrative:


The test was held in Whitewright, Texas at a private range facility on 11 March 2017. Six volunteer testers attended the event, and 14 optics were donated for testing by volunteers. Testing began at 0930 and was projected to conclude before noon. The test began well, and testers were motivated by the opportunity to contribute to the industry through data collection.

It became apparent early in the day that the protocols used were very time-consuming. By 1130, only 4 optic evaluations by 2 testers had been finished. After consultation with the lead tester, we identified that the red dot size evaluation portion of the test was overly time-intensive, especially on the 100-yard target. I relayed to the lead tester that, while this was an interesting point of data to collect, it was not essential to the test’s main purpose. We also set up an additional table to provide two additional testing stations. The holding area was decommissioned, and all testers were moved to the testing line to conduct tests simultaneously. This abbreviated the process significantly, but as it was already late in the day, many testers were approaching their deadlines to depart. After consulting with the lead tester, I communicated the priority to be the 50-yard target first, then the 25-yard target, with the 100-yard target, which consumed the most time, observed if time allowed.

Unfortunately, due to the unforeseen time involved in the testing, not all optics could be observed by all testers at all distances. It was also found that the testing form was not well thought out and that the (X, Y) coordinate fields did not make logical sense. This form was subsequently modified. Testers completed an exit survey and statement before leaving. No tester reported a lack of confidence in their findings.

Definitions:

Often the use of terms can get in the way of explaining a concept or result. The following is a list of terms used in this report that could be misconstrued. The definitions may differ from usage in other areas; however, this is how they are used here and where their definitions are derived from:

“Parallax is a displacement or difference in the apparent position of an object viewed along two different lines of sight and is measured by the angle or semi-angle of inclination between those two lines. The term is derived from the Greek word παράλλαξις (parallaxis), meaning “alternation”. Due to foreshortening, nearby objects have a larger parallax than more distant objects when observed from different positions……… In optical sights parallax refers to the apparent movement of the reticle in relationship to the target when the user moves his/her head laterally behind the sight (up/down or left/right), i.e. it is an error where the reticle does not stay aligned with the sight’s own optical axis– Wikipedia, Parallax.

The key phrase in this definition is the final clause: the reticle does not stay aligned with the sight’s own optical axis. We are using parallax movement to describe the effect of the aiming point moving from its relative point of aim on a target due to the alignment of the user’s head behind the optic.

Atmospherics:

Temperature: 68F

Humidity: 57.2%

Barometric Pressure: 30.08 inHg

Altitude: 751 ft ASL

Direction of Observation: 180deg Magnetic

Angle of Observation: 0deg

Table of Optics Tested:

Optic Brand | Optic Type | Serial # | ID #
EoTech | EXPS 3.0 | A1292043 | 1
EoTech | EXPS 3.0 | A0565568 | 2
EoTech | EXPS 3.2 | A1276811 | 3
Aimpoint | T-1 | W3229523 | 4
Aimpoint | T-1 | W3118077 | Not Tested
Aimpoint | Pro | K2980053 | Not Tested
Trijicon | RM07 | 108157 | Not Tested
EoTech | EXPS 3.0 | A1348665 | 8
Primary Arms | MD-05 | 6182 | 9
Aimpoint | T-1 | W3908997 | 10
Trijicon | MRO | 041032 | 11
Burris | Fastfire III |  | 12
Trijicon | MRO | 0611331 | 13
Trijicon | MRO | 010521 | 14

Testing Plan (Follow-On Testing)

The following section summarizes the independent follow-on tests that volunteers across the country conducted by replicating the original test with the modified testing forms.

Purpose:

To collect and compare data on red dot sight aiming dot deviation, due to parallax, in various models of optics at variable ranges, in order to validate the data sets and results from the original 11 March 2017 test and to expand the sample groups.

Method:

Users would induce angular deviation in their angle of view from one extreme to another, to replicate head position inconsistency and measure the maximum possible parallax deviation.

Goals:

  • To objectively evaluate each device in an impartial manner. 
  • To establish specific controls to ensure consistency between testers and devices.
  • To establish clear protocols to ensure repeatability and peer review. 
  • To comment only on confirmed observations, not hypothesize as to the causes.

Calibration Targets:

[Image: printable calibration target]

For the follow-on testing, a calibration target was produced that could be printed on standard printer paper. This modification was made in order to accommodate Law Enforcement units with limited time to construct the original calibration targets and military Special Operations Units that may be forward deployed. The calibration targets consist of the same 1” grid line spacing as the original, as well as the 4” black center circle with the 1” circular cutout.

Test Sheets:

In order to aid and encourage independent evaluation of the test we conducted on 11 March 17, the original testing form was modified to make the evaluation process more streamlined and easier to understand. All files were then uploaded to a shared Dropbox, and the link was shared on various firearms forums and on social media.

Instruction Sheets (Procedures and Control Measures)

The following instruction sheet was also added to the folder. Some modifications were made to the instructions to accommodate special operations units that wished to submit test results but were forward deployed, including considerations for the sensitivity of releasing the serial numbers of these units’ optics:

Testing Narrative

The follow-on testing was an extremely important component of this testing report. Due to the nature of the testing procedures used in the original test, there was a reasonable chance that tester-induced error could create inconsistent results to the point that the data sets would not be reproducible. These concerns made it essential to replicate the test completely independently and by as many separate parties as possible. If results within the standard deviation for similar optic models could not be reproduced by independent testers, it could invalidate the data sets for the purpose for which they were summarized. In that case, additional controls would have to be put in place and the testing repeated until enough variables were removed that the data sets were consistent.

All files and instructions required for the testing procedure were uploaded to a shared Dropbox folder and published on several industry forums and social media sites. As a result of this initiative, a significant number of data sheets were received that have drastically expanded the types and models in this report, as well as the comparative data points of the original test. These reports have come from a wide array of individuals from the civilian, Law Enforcement, Federal, and Military communities.


The results from the data sheets submitted by these independent parties are summarized in the following section and compared to the original test conducted on 11 March 2017.

Results

Summary Chart Legend

For all summary charts, the following abbreviations and terms are used:

VD A (IN): Average of Vertical Deviations in Inches

VD A (MOA): Average of Vertical Deviations in Minutes of Angle (shooter’s)

VD SD (IN): Standard Deviation of Vertical Deviations in Inches

VD SD (MOA): Standard Deviation of Vertical Deviations in Minutes of Angle

HD A (IN): Average of Horizontal Deviations in Inches

HD A (MOA): Average of Horizontal Deviations in Minutes of Angle (shooter’s)

HD SD (IN): Standard Deviation of Horizontal Deviations in Inches

HD SD (MOA): Standard Deviation of Horizontal Deviations in Minutes of Angle

AVG A (IN): Average of Horizontal and Vertical Deviations in Inches

AVG A (MOA): Average of Horizontal and Vertical Deviations in Minutes of Angle (shooter’s)

AVG SD (IN): Average of Horizontal and Vertical Standard Deviations in Inches

AVG SD (MOA): Average of Horizontal and Vertical Standard Deviations in Minutes of Angle (shooter’s)

Vertical Deviation

The linear measurement of the aiming dot movement, measured perpendicular to the line of sight, relative to and at the intended point of aim, as observed through the maximum viewing angle of vertical inclination and declination.

Horizontal Deviation

The linear measurement of the aiming dot movement, measured perpendicular to the line of sight, relative to and at the intended point of aim, as observed through the maximum and minimum viewing angle of horizontal deflection.

Inches

Standard linear measurement. 1/12 of a foot

Minutes of Angle

Angular form of measurement, where 1 Minute of Angle (MOA) equals 1/60th of a degree and 1 degree equals 1/360th of a circle or complete turn. True MOA subtends 1.047 inches at 100 yards for every 1 MOA and is not used for this test. “Shooter’s” MOA rounds the subtension to 1 inch at 100 yards for each 1 MOA. This is used to simplify the math and the results, and to reduce confusion for readers.
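As a quick sketch of the conversion just defined (the deviation values are hypothetical), shooter’s MOA works out to dividing the linear deviation by the distance in hundreds of yards:

```python
def inches_to_shooters_moa(inches, distance_yards):
    # "Shooter's" MOA rounds the true 1.047" subtension to 1" at 100
    # yards, so 1 MOA subtends (distance / 100) inches at any distance.
    return inches / (distance_yards / 100.0)

print(inches_to_shooters_moa(1.0, 50))   # 1" at 50 yards = 2.0 MOA
print(inches_to_shooters_moa(1.0, 25))   # 1" at 25 yards = 4.0 MOA
```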

Average

The arithmetic mean, found by taking the sum of the numbers and dividing by the count of numbers in the set.

Standard Deviation

Standard deviation is a measure that is used to quantify the amount of variation or dispersion of a set of data values. A low standard deviation indicates that the data points tend to be close to the mean of the set, while a high standard deviation indicates that the data points are spread out over a wider range of values. The STDEV.S function is used for the calculations on this sheet.
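As a minimal sketch of how these two summary statistics are produced (the deviation values below are hypothetical, not from the data sets), Python’s statistics module reproduces both the arithmetic mean and the sample standard deviation that Excel’s STDEV.S computes:

```python
import statistics

# Hypothetical Euclidean Distance deviations (inches) for one optic
deviations = [1.0, 1.5, 0.75, 1.25]

mean = statistics.mean(deviations)   # arithmetic mean, as defined above
sd = statistics.stdev(deviations)    # sample SD, equivalent to STDEV.S

print(round(mean, 3))  # 1.125
print(round(sd, 3))    # 0.323
```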

11 March 17 Test Results

Overall Results

Testing Accuracy:

The premise of this test was to evaluate possible parallax error that a shooter could induce into a red dot optic through improper viewing angle, such as may occur due to inconsistent head alignment. One of the objectives of this test was to keep the manner in which the test was conducted as simple and repeatable as possible. While a more realistic way to test for possible error would be to restrict the viewing angles to the center 50% of the available viewing area, placing repeatable controls on this angle would have complicated the testing protocols and/or required additional equipment. For these reasons, we chose to use the maximum viewing angles on each axis of movement, as the physical construction of the tube or window in the optic would provide a natural constant for viewing angle tolerances.

Red dot users also tend to perceive and reference the aiming dots differently. This could be for numerous reasons: users with astigmatism may see the EOTech red dots as an amoeba or separate dots rather than a single dot, and some Aimpoint dots may appear as an oval. Visual acuity can also affect the clarity of the target the aiming dot is being referenced to. Since the premise of this test is based on a user’s perception and ability to use an optic, it was decided not to control this variable. We feel that this makes the test unique, in that it provides results that may more closely represent those that would be replicated by interested parties attempting to recreate the test. An interesting follow-up would be to recreate this test using cameras or video equipment instead, to establish a comparative data set against the users’ visual acuity and perception.

Below is a table showing the differences in the tester’s results from the 11 March 2017 test:

[Chart: tester deviation comparison]

As you can see, there are variances in the results that the testers reported. What is very interesting is that some testers reported results far outside the average of the other testers. However, the variance was not constant across all optics. For instance, Tester ID #1 reported results that were far outside the average for the Trijicon MROs; however, the same tester was well within the average for the Burris Fast Fire and EOTech EXPS 3.0, and only slightly greater than average for the T-1.

[Chart: tester standard deviations, all tests]
When we add in the tester standard deviations from the independent testing that occurred after the 11 March 2017 test, we can see that as the test group increases, the variance in results remains fairly consistent, with Tester ID #1 on the MRO being the sole outlier. The expansion of the sample group, compared with the consistency of the results, seems to indicate that the standard deviation in the original testing group’s data points is consistent with expected results.

One of the factors of the test that minimizes the effect of results falling outside the margins is that we use multiple data points and present the results as averages. This means a single large variance in the data produces only a minimal shift in the average results. Another statistic included in the comparison charts is the Standard Deviation of the results. This gives the reader an idea of how much variance there is in the data points behind the averages displayed. As the data is displayed at varying levels and orders, the reader can use the comparison charts to judge how much confidence to place in each result.

Chart Explanation

The overall section summarizes the testing data from all distances (25, 50, and 100 yds). Data sheets are entered into a master spreadsheet, which can be found in the public files for this report, and results are calculated. Summary tables are sorted from better results to worse results and color-coded for clarity.

The values in the “Data Points” column represent how many individual tester sheets were used to produce the results given. The “VD A (IN)” and “HD A (IN)” columns represent the average of all linear distance measured from their associated data points. The linear distance was measured by taking the line trace end point coordinates using the Cartesian Graph format (x1,y1), (x2,y2). The Cartesian version of Pythagoras’s Theorem was used to find the Euclidean Distance between the two points, using the formula:

d = √((x2 - x1)² + (y2 - y1)²)
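A minimal sketch of the end-point calculation described above, with hypothetical coordinates read off a tester’s diagram in grid inches:

```python
import math

def euclidean_distance(p1, p2):
    # Cartesian form of Pythagoras's Theorem between the two
    # line-trace end points (x1, y1) and (x2, y2)
    (x1, y1), (x2, y2) = p1, p2
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

# A 3-4-5 triangle for easy checking: total dot travel is 5 inches
print(euclidean_distance((0.0, 0.0), (3.0, 4.0)))  # 5.0
```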

The “AVG A (IN)” and “AVG A (MOA)” Columns are a simple average of the Euclidean Distance from their corresponding deviation columns, i.e.: AVG A (IN) is the average of the VD A (IN) and HD A (IN) fields.

The Standard Deviation fields, “VD SD (IN)”, “VD SD (MOA)”, “HD SD (IN)”, and “HD SD (MOA)”, are not calculated from the summary charts pictured here. These values are calculated from the Euclidean Distance values for each individual optic. The average Standard Deviation fields, “AVG SD (IN)” and “AVG SD (MOA)”, are averages of the corresponding Vertical Deviation and Horizontal Deviation results.

The color scheme on the charts represents an “above average” or “below average” measurement, based on a simple average of that column, represented in the “Total Sample Average” row. This means green is a smaller or more precise value and red is a larger or less precise value. The exception is the “Data Points” column, where a smaller number of data points is labeled red and a larger sample size is green. Simply: green is a better result, red is a worse result.
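A small sketch of this color rule (the column values are hypothetical): each value is compared to the simple column average, with the comparison inverted for the Data Points column:

```python
def color_code(values, invert=False):
    # Green = better than the simple column average, red = worse.
    # invert=True handles the "Data Points" column, where larger is better.
    avg = sum(values) / len(values)
    labels = []
    for v in values:
        better = v >= avg if invert else v <= avg
        labels.append("green" if better else "red")
    return labels

# Hypothetical AVG A (MOA) column: smaller deviation = green
print(color_code([1.2, 4.8, 0.9, 6.1]))         # ['green', 'red', 'green', 'red']
# Hypothetical Data Points column: larger sample = green
print(color_code([3, 12, 5, 20], invert=True))  # ['red', 'green', 'red', 'green']
```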

*The movements observed are the movements of the aiming dot, referenced to the target, looking through the viewing window or tube. The test did not measure whether parallax caused the perceived position of the target itself to move due to changes in viewing angle. The “Vertical” or “Horizontal” descriptor describes the axis of head movement the tester used to observe the recorded results. Because many of the optics exhibited irregular and excessive movement, using the actual movement direction of the aiming dot was not possible. The irregular movement paths also drove the decision to use simple end points for the deviation calculation, as using multiple data points along many of the optics’ curving paths would have been extremely complex.*

The “Data Points” column indicates the total number of test sheets used to produce the results shown. Since not all testers were able to test all optics at all ranges (some had to leave as the test took much longer than planned), the Data Points are not evenly dispersed. Keep this in mind for the later comparisons in this report, as remote testers submitted a significant number of sheets for many of these low-data-point optics, and their results will indicate whether the smaller sample group produced a flawed model.

The following sections summarize the results of the data recorded by user evaluations during the vertical movement test.

SUMMARY OF OVERALL RESULTS

Overall Vertical Movement Evaluation:

This table summarizes the results of the data recorded by user evaluations during the vertical movement test, by optic type, and at all distances. As shown, this data is derived from 107 individual test reports.

From the Vertical Deviation results, we can see a clear separation in measurements between the optic models. To further break out these results, we can compare this table to the Overall Vertical Deviation summary, sorted by individual optics tested:

As we can see from the chart comparison, the results within each individual optic type exhibited relatively consistent behavior when compared to the same models. The exception is the MROs, where we see that one of the three MRO models, Optic ID #13, significantly outperformed Optic IDs #14 and #11. These optics, however, still performed in the bottom 50% of the group.

Many of the testers reported very minimal movement in the EXPS 3.0s tested; if movement occurred, it was at the very edge of the viewing window and was minimal.

The MRO was very difficult for the users to diagram. Most testers reported a diagonal deviation when moving their point of view vertically. The difficult part to describe was when the aiming dot passed back through the target as the tester crossed over the center of the viewing tube. At this center point, the dot would make a sharp movement before continuing its diagonal path. Some users described a “squiggle”, some a lightning bolt, some a waveform. This behavior was observed by all testers on all MROs tested in this test. To demonstrate, below is a picture of Optic ID #13’s movement diagram at 50 yards, as drawn by Tester #5. To remind the reader of perspective: the black circle represents the 4” black circle on the calibration target, and grid lines are at 1” spacing.

Testers reported varying semicircular movement patterns in the T-1s tested. All testers reported this behavior in both T-1 optics tested. An example of the irregular movement observed in the T-1 models is exhibited below in Optic ID #10’s 50-yard test form, as observed by Tester ID #1.

We also see the formation of a trend, which will continue through the varying levels of breakout detail: the optics that exhibited less aiming dot deviation due to parallax also produced a better standard deviation of results. The decrease in movement seems to indicate that testers were able to more accurately observe and reference these optics.

Editorial analysis: As an instructor, I find the Standard Deviation numbers almost more interesting than the Deviation measurements, as they clearly point to which optics are more sensitive to user error. An optic with a very low standard deviation would indicate consistent error that could be predicted and more easily accounted for; a very high standard deviation indicates inconsistent error that would be much more difficult to account for.

Overall Horizontal Movement Evaluation:

From the Overall Horizontal Deviation by model type summary chart above, we observe that there were changes in rankings only in the upper 50% of the optics, compared to the vertical movement test. Some of the MROs appeared more sensitive to horizontal head movement, while the T-1s were more sensitive to vertical head movement, but this was not consistent.

We can also see the trend of a greater precision of results with the optics that exhibited less deviation.

In the chart that shows the same summary broken out by individual optics tested, we can see some interesting trends. For instance, Optic ID #10 displays a more consistent standard deviation result of 3.5 MOA in the horizontal test, and a less consistent standard deviation result of 5.27 MOA in the vertical testing.

Editorial analysis: This data, combined with the testers’ verbal descriptions of what they observed, demonstrates how irregular movement paths can cause difficulty with the user’s ability to consistently reference the optic.

Overall Total Deviation Summary:

As we can see from the summary chart, there is a wide range in deviations, with very large increments between many of the optic models. While we generally accept that red dot optics are all subject to parallax, it is extremely clear that some optics do not succumb significantly to parallax deviation, while other optics exhibit far more extreme parallax deviation.

SUMMARY OF 25 YARD RESULTS

25 Yard Vertical Movement Evaluation:

[Chart: 25 yard vertical deviation by optic type]

In this chart, we see that 50 data sheets were available and used to produce the summaries for the 25-yard results. While some optic types had fewer data points, they do show a level of consistency with the larger sample size of data points in the overall summaries. We also see a general consistency in the rankings of the optics tested.

[Chart: 25 yard vertical deviation by Optic ID]

When we break these results out by specific optics tested, we also see some level of consistency compared to the overall summaries of all distances. We do see some minor changes in rankings within optic types, as with the EoTechs, which have changed positions relative to one another but retained their overall place within the group by type. We also see MRO Optic #13 significantly outperform #14 and #11. The Burris remains unchanged in its position, but the T-1s have changed position relative to each other and the rest of the lower 50% of the sample group.

25 Yard Horizontal Movement Evaluation:

[Chart: 25 yard horizontal deviation by optic type]

In the 25 yard summary by optic type for the horizontal deviation test, we see the same general trend in rankings. The top 50% ranking remains unchanged from the overall results at all distances. The same optics that changed position within their respective types changed in line with their change in the overall results. It is of note that the EXPS 3.2 begins to display a worsening standard deviation of results at closer distances.

When broken out by specific Optic ID #, we again see the same trend for positions compared to the summary results from all positions. Again, the EXPS 3.2 displays a worsening degree of standard deviation here as well.

25 Yard Total Deviation Summary:

[Chart: 25 yard total deviation by optic type]

While the average results at 25 yards, by optic type, remain consistent with the rankings in the summary results from all distances in the previous section, the standard deviations do not.

We also see some changes in specific optics rankings within their respective optics types, compared to the summary chart of all distances. The constant we do see is that the optics in the top 50% and the bottom 50% of ranking, remain in their respective brackets.

25 Yard Rate of Change Summary

[Chart: 25 yard percent change by optic type]

Another interesting trend is that there is no apparent constant governing whether the red dot optics tested displayed an increasing or decreasing degree of parallax deviation at closer versus longer distances. It is generally assumed that red dot optics are more susceptible at closer distances than at longer distances. While we do see that all optics displayed more angular (MOA) deviation at closer distances, there is not a constant rate of change. The optics that displayed significantly less deviation at closer ranges displayed a drastically higher rate of increase. The optics that displayed significantly higher angular deviation at closer distances displayed a significantly lower rate of increase. However, none of the optics displayed any constant rate of change when compared with other optic models.
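A worked sketch of why angular deviation can grow at closer distances even as the linear deviation shrinks (the deviation values are hypothetical): because shooter’s MOA scales with distance, an optic whose linear deviation halves going from 100 to 25 yards still shows double the angular deviation:

```python
def shooters_moa(inches, distance_yards):
    # Shooter's MOA: 1 MOA subtends (distance / 100) inches
    return inches / (distance_yards / 100.0)

# Hypothetical linear deviations for one optic
moa_100 = shooters_moa(1.2, 100)  # 1.2" at 100 yd -> 1.2 MOA
moa_25 = shooters_moa(0.6, 25)    # 0.6" at 25 yd  -> 2.4 MOA

# Percent change in angular deviation moving from 100 yd to 25 yd
print((moa_25 - moa_100) / moa_100 * 100)  # 100.0
```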

SUMMARY OF 50 YARD RESULTS

50 Yard Vertical Movement Evaluation:

[Chart: 50 yard vertical deviation by optic type]

The 50-yard vertical evaluation averages, by optic type, remain consistent with the ranking positions in the similar overall results at all distances, with the exception of the Primary Arms optic, which drops to the bottom slot.

[Chart: 50 yard vertical deviation by Optic ID]

The breakout detail by Optic ID # shows the same marginal ranking changes between optic types and similar changes in the lower 50% rankings. The Burris remains at a constant position.

50 Yard Horizontal Movement Evaluation:

[Chart: 50 yard horizontal deviation by optic type]

The horizontal deviation table is largely consistent with the changes observed in the vertical summary.

[Chart: 50 yard horizontal deviation by Optic ID]

The breakout detail, by Optic ID #, again shows results consistent with the previous averages, with the top 50% remaining constant and the bottom 50% marginally changing rankings.

50 Yard Total Deviation Summary:

[Chart: 50 yard total deviation by optic type]

The 50 yard summary data (of the 47 available data sheets), by optic type in the table above, continues to display general consistency with the overall results. We do, however, see the Primary Arms optic fall in the rankings in the vertical deviation table.

[Chart: 50 yard total deviation by Optic ID]

The by-Optic-ID # breakout follows the same ranking change trend as the vertical and horizontal chart changes for the 50 yard results.

50 Yard Rate of Change Summary:

[Chart: 50 yard percent change]

As before, we see the lack of a constant rate of change in deviation as we move back to 50 yards, compared to the overall summary of all distances. Here, even though the deviations were minimal to begin with, we see the EoTechs drastically decrease their percent change in parallax, compared to the other model types, which show a very large amount of deviation and a small rate of change. This is an interesting data set that challenges the notion that red dot optics are either parallax free beyond 40 or 50 yds or that the amount of parallax decreases with distance. It also challenges the notion that parallax affects all red dot optics in the same way, or in any consistent fashion when compared across models.

SUMMARY OF 100 YARD RESULTS

Unfortunately, the 100 yard summary has the smallest sample size. This test was done towards the end of the test day and, due to the level of aiming precision required, took significantly more time to conduct.

100 Yard Vertical Movement Evaluation:

100 Yard Vertical Deviation by Optic Type

In this chart, we see a new color: the yellow blocks indicate that there is only one data point for that model, and a standard deviation cannot be determined from a single data point. The general ranking of the optic models with regard to vertical deviation, from the 10 data points, remains consistent with the overall results despite the small sample size.
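For readers replicating the math: the averages and standard deviations in these tables are the standard sample statistics, and the "N/A" cells follow from the single-sample case. A minimal sketch in Python, with made-up measurement values, mirrors that behavior:

```python
from statistics import mean, stdev

def summarize(points):
    """Return (average, standard deviation) for a list of deviation
    measurements in inches. The standard deviation is None ("N/A" in
    the tables) when only one data point exists, since it cannot be
    computed from a single measurement."""
    avg = mean(points)
    sd = stdev(points) if len(points) > 1 else None
    return avg, sd

# Hypothetical measurements (inches) for two optics:
print(summarize([1.2, 1.5, 1.1]))  # average and standard deviation
print(summarize([2.0]))            # one data point -> (2.0, None)
```

The measurement values here are illustrative only; the released raw data sheets contain the actual figures.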

100 Yard Vertical Deviation by Optic ID

Compared with the overall summary of all distances, the vertical deviation results at 100 yards remain relatively consistent. It is interesting that there was no consistency between the MRO models. From the data we collected, the MRO optic type displayed the least consistent observation results when compared to the other models. The extreme shift that was observed in Optic ID #11 was confirmed by two separate observers, as represented in the data points and the corresponding movement diagrams drawn by the observers. While we do not have an equal number of data points for all the optics on this chart, there is an extreme difference between the top and bottom optics. The results of the MROs and T-1s were a great surprise to the testing group, who expected parallax to decrease significantly at further distances, as it did for the EOTech.

100 Yard Horizontal Movement Evaluation:

100 Yard Horizontal Deviation by Optic Type

Again, we see consistency in the relative rankings compared to the overall summary charts.

100 Yard Horizontal Deviation by Optic ID

As we break out the results here by Optic ID number, we see that some of these optics showed a very large increase in deviation when the testers observed for movement while adjusting their horizontal viewing deflection, as opposed to the previous chart, which displayed the results from the users adjusting their vertical viewing angle. Particularly startling were the observations from Optic #11, for which testers reported that the aiming dot moved nearly completely outside of the 24” wide calibration target at 100 yards.

100 Yard Total Deviation Summary:

100 Yard Total Deviation by Optic Type

The average table for 100 yards, by optic type, shows a very clear separation in deviation amount between the four optic types. Testers were very surprised by the Burris throughout this test, but especially at 100 yards, considering it is a micro red dot and not generally considered in the same class as the larger optics in the test that are more commonly employed at this distance.

100 Yard Total Deviation by Optic ID

The by-Optic ID # breakout continues to fall in line with the ranking trends of the overall summary of all distances, with the same effect of the lower ranking optics changing position.

100 Yard Rate of Change Summary:

100 Yard % Change

The percentage change in angular deviation generally agrees with the common understanding that optics are less susceptible to parallax at longer distances. However, as before, the change was not linear and varied greatly between the optic models.

Original Test Summary

The results of this test are representative of independent tester evaluations, solely of the specific optics that were tested, under the conditions tested. They cannot be construed to represent all optics produced under these model lines. While the various optics of the same model type have large differences in their serial numbers, which most probably indicates a significant difference in production dates, there is insufficient data to draw conclusions or comparisons between production years.

One interesting aspect of this test has not yet been addressed in this report: whether optics that were tested and found to have excessive and irregular parallax deviation could be sent back to the manufacturer for repair. That brings us to one specific optic in this test. This optic was donated for testing by a volunteer who could not attend. The owner of this optic had noticed, first-hand, irregular and excessive aiming dot movement. These errors presented both visually and through POI shift during consecutive groups. The owner contacted the manufacturer directly about this issue and requested resolution. The manufacturer agreed to an RMA and the owner sent the optic in. The owner received an email confirmation that the optic had been received, and later that it had been fixed, before receiving the optic back in the mail. This optic was ID #10, an Aimpoint T-1. Unfortunately, Aimpoint would not disclose to the owner what parts were repaired, altered, or replaced, nor whether any physical defect was present.

Photo: Optic ID #10, Aimpoint T-1

This specific optic’s deviation was measured, overall, by 15 separate data points, as opposed to the 10 data points for the other T-1 tested. The overall results of this optic that had been repaired by Aimpoint differed from the non-repaired T-1 by 0.135327493 inches of Average Overall Deviation and 0.064025083 inches of Standard Deviation in the overall summary results.

Both the testing group and I are confident in the results we have recorded, and we are releasing all raw data and calculations, including tester sheets, photographs of optics and serial numbers, testing conditions, protocols, and procedures used. The intent is to be as transparent as possible and to afford independent and interested parties the ability not only to confirm the summaries represented in this report, but also to replicate the test independently with other optics of the same type and model, so that the results here can be confirmed, corrected, or disputed. We encourage others to do so and welcome their results.

Follow-on Test Results and Comparison

SUMMARY OF OVERALL RESULTS

After the initial testing that we hosted, we published the testing procedures and invited anyone who was willing to replicate the testing with their own optics. The independent remote testers then submitted their testing forms, and their data were added to the results. The following section summarizes the data that was submitted and compares it to the original testing.

Overall Vertical Movement Evaluation:

Vertical Deviation, Follow-On Test


This table summarizes the results of the data recorded by user evaluations during the vertical movement test, by optic type and at all distances. As shown, this data is derived from 72 individual test reports.

Vertical Deviation, All Tests

This table represents the combined data from the Original and Follow-On tests and is comprised of 179 data points. The only two optics that are common between the Original tests and the Follow-On tests are the EOTech EXPS 3.0 and the Aimpoint T-1, which both show fairly consistent results between the respective tests. We also see that there is a drastic performance difference between the Aimpoint T-1 and T-2, with the T-2 performing almost as well as the EXPS 3.0.

From the Vertical Deviation results, we continue to see a clear and consistent separation in measurements between the optic models.

Overall Horizontal Movement Evaluation:

Horizontal Movement, Follow-On Test


From the Overall Horizontal Deviation by model type summary chart above, we see changes in rankings for many of the optics; however, the EXPS 3.0 and the T-2 remain at the top. The EOTech 516 tested showed significantly more sensitivity to horizontal head movement than vertical head movement, as did the Aimpoint Comp M4 and the Vortex StrikeFire II.

Horizontal Movement, All Tests

The table above shows the Horizontal Deviation test results from both test series. Again, we see the trend of greater precision in the results of the optics that exhibited less deviation, continuing as it did in the original tests.

Overall Total Deviation Summary:

Total Average Deviation, All Tests

Total Average Deviation, Follow-On Test


The two charts above compare the overall results of both tests (above) and the follow-on test (below). Comparing the two charts, using the data from the two common optic models (EXPS 3.0 and T-1), we can see a measure of consistency between the tests. In the follow-on tests, the EXPS 3.0’s results only differed by 0.020680957 inches in standard deviation. The Aimpoint T-1 only differed by 0.441285131 inches in standard deviation between the results. The Leupold LCO surprised all testers involved with its large degree of diagonal movement.

SUMMARY OF 25 YARD RESULTS

25 Yard Vertical Movement Evaluation:

25 Yard Vertical Deviation, Follow-On Test


In this chart, we see that 23 data sheets were available and used to produce the summaries for the 25-yard results. Unfortunately, a few optic models have only one data point, submitted by remote testers. These are evident by the “N/A” in the Standard Deviation fields, as a standard deviation cannot be derived from a single result.

25 Yard Vertical Deviation, All Tests

Regardless, the chart above of the overall test results remains consistent in model ranking. The Trijicon and Leupold models tested showed a significant amount of movement at 25 yards, exceeding 4 inches (>16 MOA). This may be a concern for end users who cannot consistently keep the aiming dot precisely centered in the viewing window due to CQB environments, NVG use, or protective mask use. On the other hand, the EXPS series and the T-2 show a level of viewing-angle forgiveness that is measurably superior to the lower-ranked models.
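For reference, the MOA figures in this report come from converting a linear deviation at a known distance into an angular one. Using the common shooter’s approximation of 1 MOA ≈ 1 inch at 100 yards (the exact value is about 1.047 inches), 4 inches of movement at 25 yards works out to 16 MOA. A quick sketch of the conversion:

```python
def inches_to_moa(inches, distance_yards, true_moa=False):
    """Convert a linear deviation (inches) at a given distance (yards)
    into minutes of angle. Uses the shooter's approximation of
    1 MOA = 1 inch per 100 yards by default, or the exact value
    (~1.047 inches per 100 yards) when true_moa is set."""
    inch_per_moa_at_100 = 1.047 if true_moa else 1.0
    return inches / (inch_per_moa_at_100 * distance_yards / 100.0)

print(inches_to_moa(4.0, 25))  # 16.0 -> 4 inches at 25 yards is 16 MOA
```

The same linear error shrinks in angular terms as distance grows, which is why the 25 yard numbers look so large in MOA.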

25 Yard Horizontal Movement Evaluation:

25 Yard Horizontal Movement, Follow-On Test


In the 25 yard summary by optic type for the horizontal deviation test (above), we see various changes in position. The only two optics that performed slightly better in the horizontal viewing angle test were the T-1 and SRS; the rest performed somewhat worse, to varying degrees.

25 Yard Horizontal Movement, All Tests

When the horizontal deviation results from the follow-on tests and the original tests are combined in the chart above, we continue to see the previous trends of rankings.

25 Yard Total Deviation Summary:

25 Yard Total Average Deviation, Follow-On Test


Above we see the total averages for the Follow-On tests, which show a level of consistency in rankings for most optics, except for the Comp M2 and Comp M4, which move around depending on how the data is broken out, as they showed greater or lesser sensitivity to horizontal head movement.

25 Yard Total Average Deviation, All Tests

The combined testing results above, compared with the combined results of the vertical and horizontal testing, show that only four optic models maintained less than 2 inches of movement at 25 yards: the EXPS 3.0, EXPS 3.2, T-2, and the Fast Fire.

25 Yard Rate of Change Summary

However, we do see that there is no consistency between the models with the generally perceived “rules” with regard to parallax at closer distances. The charts below demonstrate this:

25 Yard % Change, Follow-On Test

25 Yard % Change, All Tests

The chart above displays the % change in deviation at 25 yards compared to the average results at all distances, while the chart below it shows the same for the combined results of the original and follow-on tests. No optic model displayed less parallax at closer range. However, the amount of change varied drastically between optic models: some showed negligible increases in movement, while others increased by more than 50%.

Another interesting trend is that there is no apparent constant governing whether red dot optics display increasing or decreasing parallax deviation at closer versus longer distances. It is generally assumed that red dot optics are more susceptible at closer distances. While all the optics did display more angular (MOA) deviation at closer distances, there was no constant rate of change: the optics that displayed significantly less deviation at close range showed a drastically higher rate of increase, while the optics that displayed significantly higher angular deviation at close range showed a significantly lower rate of increase. None of the optics displayed a consistent rate of change when compared with the other models.
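The rate-of-change percentages in these charts can be reproduced by comparing a single distance’s average angular deviation against the all-distance average. The numbers below are hypothetical, for illustration only:

```python
def percent_change(distance_avg_moa, overall_avg_moa):
    """Percent change of one distance's average angular deviation
    relative to the all-distance average. Positive values mean more
    parallax at that distance than the overall average."""
    return (distance_avg_moa - overall_avg_moa) / overall_avg_moa * 100.0

# Hypothetical optic: 6.0 MOA average at 25 yards vs 4.0 MOA overall
print(percent_change(6.0, 4.0))  # 50.0 -> a 50% increase at 25 yards
```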

SUMMARY OF 50 YARD RESULTS

50 Yard Vertical Movement Evaluation:

50 Yard Vertical Deviation, Follow-On Test

The 50-yard vertical evaluation averages, by optic type, for the Follow-On tests have almost twice as many data points as the 25 yard results. This gives us a standard deviation value for all but one optic.

50 Yard Vertical Deviation, All Tests


The chart above, showing the combined results, begins to show a trend in rankings– with the EXPS series and the T-2 consistently towards the top and the MRO and LCO towards the bottom. The T-1 remains in the bottom half.

50 Yard Horizontal Movement Evaluation:

50 Yard Horizontal Movement, Follow-On Test

The horizontal deviation table for the Follow-On tests shows that the EOTech 516 remains more sensitive to horizontal head placement than the other EOTech models, as does the Aimpoint Comp M4. The Leupold LCO and the Vortex Razor both more than double their amount of deviation at 50 yards, compared to 25 yards.

50 Yard Horizontal Movement, All Tests


The overall horizontal deviation results continue to show consistency in rankings for some of the optics. We also see that the Primary Arms optic almost doubled its movement from 25 yards to 50 yards, and the trend continues that most of the optics display more parallax movement when the viewing angle changes horizontally rather than vertically.

50 Yard Total Deviation Summary:

50 Yard Total Average Deviation, Follow-On Test

The 50 yard summary data (from the 41 available data sheets), by optic type, of the Follow-On testing in the table above continues to display the relatively stable top rankings of the EXPS 3.0 and the T-2. The T-1 continues to sit in the bottom 50% of the rankings, with over twice the movement of the top optics.

50 Yard Total Average Deviation, All Tests


The 50 yard summary of both test groups highlights the misconception that most red dot optics are “parallax-free” at 40-50 yards (depending on the manufacturer’s claims). The Trijicon, Primary Arms, and Leupold models tested showed anywhere from nearly 6 to more than 6 inches of possible aiming error at 50 yards due to head position misalignment. These errors stand in stark contrast to the EXPS series and the T-2, which both present much less error, even with the extreme range of viewing angles used for testing.

50 Yard Rate of Change Summary:

50 Yard % Change, All Tests


50 Yard % Change, Follow-On Test

As before, we see the same lack of a constant rate of change in deviation for the Follow-On tests as we move back to 50 yards, compared to the overall summary of all distances. We do see the trend for all of the optics to generally display less movement at 50 yards than at 25 yards, but again, the rate of increase or decrease is not consistent between models. One correlation is that the optics that display less overall movement also display a much more consistent amount of change in movement across distances.

SUMMARY OF 100 YARD RESULTS

Unfortunately, the 100 yard summary again has the smallest sample size. This test continues to be the most tedious to conduct due to the aiming precision required. The results are, nonetheless, interesting.

100 Yard Vertical Movement Evaluation:

100 Yard Vertical Deviation, Follow-On Test


In this chart, we see the EOTech EXPS 3.0 display very little movement at 100 yards, while the T-1 and LCO display a significant amount of error.

100 Yard Vertical Deviation, All Tests


Compared with the overall summary of all distances, the vertical deviation results at 100 yards are surprising. Although there is only one testing data point for the Fast Fire, the tester reported no movement, whatsoever. The LCO exceeded 6 inches of movement, the T-1 exceeded 8 inches and the MRO exceeded 9 inches.

100 Yard Horizontal Movement Evaluation:

100 Yard Horizontal Movement, Follow-On Test


Again, at 100 yards, we see the tendency of all the optics tested to be more sensitive to horizontal head position than the vertical head position.

100 Yard Horizontal Movement, All Tests


The horizontal head position sensitivity that we see in the optics tested shows that the MRO is capable of inducing enough aiming error, due to head alignment issues, to miss an IPSC target “A” zone (15”) at 100 yards.

100 Yard Total Deviation Summary:

100 Yard Total Average Deviation, Follow-On Test


The average table for 100 yards, for the Follow-On tests, shows a good level of precision in measurements for the top and bottom optic– meaning both testers saw nearly the same result. As with previous results, the rankings remain relatively constant.

100 Yard Total Average Deviation, All Tests


Looking at the combined test result table– we see the vast difference in possible aiming error between the optics models. However– rankings do remain relatively constant.

100 Yard Rate of Change Summary:

100 Yard % Change, Follow-On Test


100 Yard % Change, All Tests

The 100 yard % change table is where we see a continuation of the generally perceived notion that optics display less parallax at distance than at close range, with two exceptions: the Aimpoint T-1 (in the Follow-On test table) and the EXPS 3.0 (in the combined result table). However, the EXPS 3.0’s error amount was significantly less than the T-1’s.

Manufacturer’s Claims

One of the most significant aspects of the test is the comparison of the observed results against the specific manufacturers’ claims about the parallax characteristics of the optics. It should be noted that the manufacturers do not make clear which aspect of parallax they refer to in their product data. As parallax is defined as the apparent change in position of an object viewed from two different angles, it could refer (in the case of this test) to either red dot movement or actual target (viewing area) movement:

  • Aimpoint claims that the T-1 is a “1X (non-magnifying) parallax free optic” (Aimpoint, 2017), while the overall results showed an average deviation of 9.678492518 MOA from all distances and tests.
  • Aimpoint claims that the T-2 is a “1X (non-magnifying) parallax free optic” (Aimpoint, 2017), however, the average deviation observed across all distances and tests was 4.5 MOA.
  • Aimpoint claims that the Comp M2 is “Absence of parallax – No centering required” (Aimpoint, 2017), however, the average deviation observed across all distances and tests was 6.289849283 MOA.
  • Aimpoint lists no parallax claims on their website, that could be found at the time of publication, about the Comp M4 or the PRO.
  • Vortex claims that the StrikeFire II is “Parallax Free” (Vortex Optics, 2017), however, the average deviation observed across all distances and tests was 7.702254543 MOA.
  • Vortex claims that the Razor is “Parallax free” (Vortex Optics, 2017), however, the average deviation observed across all distances and tests was 15.59702284 MOA.
  • Trijicon claims the SRS is “PARALLAX-FREE” (Trijicon, 2017), however, the average deviation observed across all distances and tests was 16.26182102 MOA.
  • Trijicon claims the MRO is “PARALLAX-FREE” (Trijicon, 2017), however, the average deviation observed across all distances and tests was 13.37388861 MOA.
  • Leupold claims “The Leupold Carbine Optic (LCO) is parallax free” in an answer to the product questions (Service, 2017), however, the average deviation observed across all distances and tests was 12.86041119 MOA.
  • EOTech claims that their optics are subject to parallax error of up to 14 MOA (EOTech, 2017). This claim is made generally on their FAQ page, without being model specific; however, the averages of the models tested across all distances and tests were 1.658588792 MOA for the EXPS 3.2, 1.723615393 MOA for the EXPS 3.0, and 3.400581317 MOA for the 516.
  • Burris claims that the Fast Fire 3 is “parallax free” (Burris Optics, 2017), however, users noted an average of 4.024137943 MOA of movement.
  • At the time of this testing, we could find no public claims by Primary Arms as to the parallax characteristics of the optic tested.

As we can see, there is a wide variance between what is claimed by the manufacturers and what was observed. All but EOTech, which overestimated error, failed to produce results that match their claims.
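To make that gap concrete, here is a small sketch tabulating a few of the claimed parallax bounds against the observed averages listed above. Treating “parallax free” as a claim of 0 MOA is our simplifying assumption, and only a subset of the tested models is shown:

```python
# Claimed parallax bound (MOA; 0 for "parallax free" under our assumption)
# paired with the observed overall average deviation reported above.
claims = {
    "Aimpoint T-1":    (0.0, 9.678492518),
    "Aimpoint T-2":    (0.0, 4.5),
    "Vortex Razor":    (0.0, 15.59702284),
    "Trijicon MRO":    (0.0, 13.37388861),
    "EOTech EXPS 3.0": (14.0, 1.723615393),
}

for model, (claimed, observed) in claims.items():
    verdict = "within claim" if observed <= claimed else "exceeds claim"
    print(f"{model}: claimed <= {claimed} MOA, observed {observed:.2f} MOA ({verdict})")
```

Only the EOTech, with its stated 14 MOA bound, comes out within its own claim.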

Closing:

The intent of this testing effort was to raise the bar for what consumers in the industry expect. This by no means should be considered an exhaustive, complete, or irrefutable work. On the contrary, readers should endeavor to test their own equipment in this form (or a better one) to produce some form of reproducible data. Too often do we engage in hero worship, groupthink, or product fandom to drive our equipment selection or opinions. We should have a data-driven approach to equipment analysis; this way our conversations and debates can be centered in fact instead of emotion. Emotional attachment to equipment and the blind following of fan groups breed a toxic environment where substandard equipment is allowed to persist. It also diminishes our ability to demand performance from the equipment we purchase with our hard-earned money and prevents us, as a community, from advancing the technology. My challenge to you: Get out and prove this test wrong, prove it right, I don’t care – just get out and collect data. Be part of the solution.

Bibliography

Aimpoint. (2017, June 15). Aimpoint Micro T-1. Retrieved June 15, 2017, from Aimpoint Official Website: http://www.aimpoint.com/product/aimpoint-micro-t-1/

Aimpoint. (2017, June 15). Aimpoint Micro T-2. Retrieved June 15, 2017, from Aimpoint Official Website: http://www.aimpoint.com/product/aimpoint-micro-t-2/

Aimpoint. (2017, June 15). CompM2. Retrieved June 15, 2017, from Aimpoint Official Website: http://www.aimpoint.com/product/aimpointR-compm2/

Burris Optics. (2017, June 15). FastFire 3 | Burris Optics. Retrieved June 15, 2017, from Burris Optics Official Website: http://www.burrisoptics.com/sights/fastfire-series/fastfire-3

EOTech. (2017, June 15). Holographic Weapons Sights FAQ. Retrieved June 15, 2017, from EOTech Official Website:

Service, L. T. (2017, June 15). Leupold Carbine Optic (LCO). Retrieved June 15, 2017, from Leupold Official Website:

Trijicon. (2017, June 15). Trijicon MRO Patrol. Retrieved June 15, 2017, from Trijicon Official Website:

Trijicon. (2017, June 15). Trijicon SRS. Retrieved June 15, 2017, from Trijicon Official Website:

Vortex Optics. (2017, June 15). Razor Red Dots. Retrieved June 15, 2017, from Vortex Optics Official Website:

Vortex Optics. (2017, June 15). StrikeFire II Red Dots. Retrieved June 15, 2017, from Vortex Optics Official Webpage:

The Duty to be Prepared

In the last few days I have been fielding constant phone calls from clients, asking if they should be doing anything to prepare for uncertain times.  Many are worried about the upcoming election and are surprised when I say that they should be prepared no matter who wins.  Beyond the fact that general preparedness is good practice – if Hillary wins, I expect to see a massive run on gun and ammo sales that we will most likely not recover from due to impending executive orders.  On the other hand, if Trump wins – it is not unlikely that we might see civil unrest from organizations that have already been involved in such activities.  To add to the political and civil unrest, we need to look no further than our eastern seaboard in the last week.  We are always one disaster away from our power grid failing, our water system being compromised, or our emergency services being overwhelmed.


Duty, Responsibility, or Good Practice?


I have strong views on whether people should be self-reliant.  Let me pose something to you: Imagine a natural disaster like the recent hurricane that hit FL and the East Coast.  Now take two families.  One family has the means and ability to be prepared, yet chooses not to because they live a comfortable life and aren’t concerned.  Another family does not have the means or ability and is not prepared.  A disaster happens, and the family without means dies or is injured waiting for emergency services to respond, while the family with means is being helped.  What is the liability and responsibility of the family with means and ability?  I’m not speaking in a legal sense, more a moral one.  I’m merely pointing out that by prioritizing purchasing the newest iPhone over having supplies – you place yourself in the situation of being an unnecessary burden on the system.


What do I need?


So, I want to break this down simply into two categories: natural disasters and civil unrest.  While both situations can share some of the same requirements and, quite arguably, have the need for firearms – I want to tailor this article to be a bit more inclusive to people who may not be a fan of firearms.  So, please spare the “needing firearms for everything” comments – I agree with you.  But, we need non-firearm families to be part of the solution as well.  Also, I want to state that this is not an all-inclusive guide to prepping. There are many sites, articles, and books for that.  This is about basic and common practices for average people who aren’t necessarily preparedness minded.


Natural Disasters


For me, I would prioritize this as the first thing to prepare for.  The reasons are that the incidence of occurrence is much higher (unless you live in a really bad neighborhood) and the steps you take will apply to almost all situations.  You will notice that with basic preparedness, I always lean towards initially outfitting with the goal of maintaining the ability to move away from the house or place of storage.  Measures that are heavier or bulkier, in my opinion, should come later.


ONE CASE OF BOTTLED WATER PER HOUSEHOLD MEMBER:  This is an emergency source of water should your utility system be compromised.  This isn’t an outlying chance – this happened in North Carolina just recently.  The reason for the bottled water is that it is very easy to separate and distribute, which gives you more flexibility should you need to go somewhere.  There are other measures you can take of course.


30 DAY SUPPLY OF FOOD PER HOUSEHOLD MEMBER:  I recommend getting the assorted buckets of Mountain House camp meals or similar.  Of course canned goods, MRE’s, and hydrated food are great – however, try to throw a few days of canned or hydrated food into a bag and walk a distance with it.  The dehydrated meals, again, give you a lot of flexibility should you need to move.  Something like this: https://www.amazon.com/Mountain-House-Just-Classic-Bucket/dp/B00955DUHQ/ref=pd_bxgy_468_img_2?ie=UTF8&psc=1&refRID=TD9SR7S9BGASSNH24ZWJ


ABILITY TO FILTER AND SANITIZE WATER:  Again – I prefer to maintain the capability of mobility here.  You will want something to filter out sediments and particles.  Many of these devices have microbial filters, but I still tend to not rely on them for complete sanitization.  There are some very nice ones like this: https://www.amazon.com/gp/product/B0007U00YE/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B0007U00YE&linkCode=as2&tag=realsurvi-20&linkId=TTRYW6MAJC7T7YDK

However, the price for these is high – and the ability to purify water definitely falls into the, “Two is one and One is none” category for redundancy needs.  So, if you can do one per family member – great. If not, look at something like this:

https://www.amazon.com/gp/product/B006QF3TW4/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B006QF3TW4&linkCode=as2&tag=realsurvi-20&linkId=QC33SOW4U4X2YPX3 .

Next you need the ability to sanitize water.  Of course you can boil it – but whatever fuel you have may be a finite resource.  Also, extra fuel for boiling is heavier compared to the following options.  A SteriPen is a great device:

https://www.rei.com/product/847549/steripen-ultra-water-purifier

It is very simple and self-explanatory to use.  It isn’t too expensive, so if you can’t get one per family member – just get one and then back it up with chemical treatment tabs like these:

https://www.rei.com/product/695229/katadyn-micropur-purification-tablets-package-of-30

There are, of course, many different chemical treatment options out there, not just these.  Just make sure you know how to properly use them.


Remember – water is important.  You can go a while without food – without water, you won’t last long.  These items give you the ability to either continue to use utility water that may be contaminated or obtain it safely from natural sources.

 

500 MILES OF FUEL:  In the case of natural disasters, fuel supplies can run short.  Pick up some good gas cans that don’t leak, fill them up, add a fuel stabilizer, and safely store them in a shaded area.  The amount you need depends on your vehicle’s mileage and how many vehicles you might need.  This amount of gas should be adequate to get you away from the affected area should you need to leave in an emergency.  Most fuel stabilizers claim they are good for 1-2 years of storage, but I generally recommend storing for only 6 months and switching out stores when the fuel blends change for the season.  At that point, just use the gas in your vehicle and re-fill the cans with the new season’s blend.  Another important note is that you should get into the habit of topping off your vehicle.  Don’t return home with your tank below ¾; top it off – things can develop very quickly overnight.
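Sizing your own fuel store is just target range divided by mileage, per vehicle. A rough sketch, where the mpg figures are example assumptions – substitute your vehicles' real-world numbers:

```python
# Back-of-the-envelope fuel storage math. The mpg numbers below are
# example assumptions; use your own vehicles' real-world mileage.
def gallons_to_store(range_miles: float, mpg: float) -> float:
    """Gallons needed for one vehicle to cover the target range."""
    return range_miles / mpg

# e.g. a truck at 15 mpg plus a sedan at 30 mpg, 500-mile target each
total = gallons_to_store(500, 15) + gallons_to_store(500, 30)
print(round(total, 1))  # 50.0 gallons, i.e. ten 5-gallon cans
```

Remember to subtract what a topped-off tank already covers if you want to trim the number of cans.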

 

I’m going to stop the recommendations there to keep this focused.  There are numerous other things to think about, like flashlights/candles, backpacks, clothing, footwear, weather gear, safety gear, etc.  This is just a starting point for you to get a bit more prepared to be independent in the case of an emergency.

 

Offensive/ Defensive Capability

 

Before starting this part – I want to state that you should only get what you can use.  Unlike food, water, etc., a firearm is of significantly less use if you have no idea how to operate it.  You can grab a camp meal off a shelf, read the instructions, and be fine – not so much with a gun.  Get armed, but get training.  I also want to state here that the following recommendations are starting points.  This is for the average person who might need to protect their home or make it to and from the gas station during a disturbance.  If you are bugging out and taking a 700 mile overland route to Montana, then this probably isn’t the article for you.

 

CONCEALED CARRY PISTOL FOR ALL LEGAL FAMILY MEMBERS (and license where required):  I have been getting this question a lot with the upcoming election and worries about gun control.  Most people tell me they want to get another AR-15 and ask for recommendations.  They are usually taken off guard when I first ask, “Does every member of your family who can legally carry have a concealed carry pistol?”  We need to establish priorities based on the probability of occurrence.  Of course more AR-15s will be beneficial for stacking zombies in the front yard, and they are one of the best home defense platforms.  However – it is most likely that society will not devolve to the point where you will be able to stroll into your local Walmart in all your MultiCam glory with your AR-15 at the ready, without attracting significant law enforcement attention.  A good concealed gun gives you much more flexibility as to what places you can access.  You should also have 3 magazines PER pistol at a bare minimum.  The more loaded magazines you have, the less you will have to load when you may be pressed for time.

 

2,500 ROUNDS PER PISTOL:  This may sound like a lot, but it isn’t.  This amount is to protect you from ammunition shortages (which can last extended periods) and severe price fluctuations.  Remember – if ammo is scarce, it will have a high value, giving you the ability to trade or sell for other things you may need.  I’ll readily take a pound of rounds over a pound of gold if things get really bad.  Generally, I recommend buying in bulk.  Use a search site like www.Gunbot.net to find decent, cheap ammo.  Then do some math.  Take your estimated 6 month ammo usage (because you are training regularly, AREN’T YOU?!?!?!?), add it to 2,500, then multiply by how many primary pistols you have per household.  If you standardize on 9mm, you won’t have to break out special calibers.  Buy it and it can be shipped right to your house.  When you burn through your training allotment, reorder.  Make sure you are rotating stock as well.  You may ask, “What about self-defense rounds vs. training rounds?”  I’m just keeping the math simple here.  Of course, good expanding pistol rounds are better than cheap Full Metal Jacket training ammunition.  I’ll always prefer people to carry rounds that reduce over-penetration, but you have to make decisions based on your budget.  I’d rather you carry FMJ than not carry at all.
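The ordering math above can be sketched in a few lines. The six-month training usage here is purely an example figure; plug in your own round count:

```python
# The stockpile math described above: (reserve + 6-month training
# usage) x number of primary pistols. The usage figure of 1,200 is an
# example assumption -- substitute your actual training volume.
def pistol_ammo_order(reserve_per_pistol: int = 2500,
                      six_month_usage: int = 1200,
                      pistols: int = 2) -> int:
    """Rounds of 9mm to order for the household."""
    return (reserve_per_pistol + six_month_usage) * pistols

print(pistol_ammo_order())  # 7400 rounds for two pistols
```

The same formula works for the rifle recommendation below – just swap in the 5,000-round reserve.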

 

ONE AR-15 RIFLE:  This may ruffle some, but keep in mind this is a starting point.  One per legal family member is definitely a good thing – but it may not be in the budget.  Get a DECENT AR-15 from a reputable manufacturer.  Check legitimate reviews that aren’t from paid endorsers.  Get it set up right.  We all have opinions on how to set one up, so I’ll keep this generic.  Have iron (backup) sights and an optic (red dot or scope).  Have a sling like a VTAC.  Have a mounted white light.  Have an extra set of batteries for ALL accessories stored ON the rifle (like in a buttstock compartment).  Have 5 (minimum) magazines from a reputable manufacturer.  Know how to use it.  Keep it clean and lubed.  Have it zeroed.  Also, SBRs may be a bad idea.  You lose significant velocity and performance at range.  Make sure you weigh the size/weight against capability before going SBR.

 

5,000 ROUNDS PER RIFLE:  Same advice and math as the pistol ammo – buy in bulk and rotate.  Regular XM193, XM855, or 55gr Hornady Lake City reloads will work – not everyone can afford to buy Black Hills TSX in bulk.  Just make sure it is the ammunition you are zeroed for.

 

WAIT, YOU FORGOT TONS OF THINGS!!!!!

 

            I didn’t forget anything; I intentionally tried to keep this as focused as possible.  This list could be a book if I included everything.  The considerations expand rapidly once you start thinking things through.  But that’s good – I’m glad you’re thinking about them.  Once you get the minimum items prepared – you SHOULD be thinking about more contingencies and adding/refining your gear as appropriate.  There is one glaring omission here:

MEDICAL GEAR

            After some deep thought, this wasn’t left out by mistake.  When it comes to medical gear – carrying equipment you don’t know how to use can be dangerous.  Again – this article is not for the hardcore, “My whole street block trains together”, #prepperteamforlife audience.  This is for the average person or family that DOESN’T have ANY of these items.  If you’re mad about the medical omission, that probably doesn’t include you.  You should also know that you can carry all the combat gauze you want, but stuffing it into a tension pneumothorax won’t help the patient.  You can also field-improvise tourniquets, chest seals, etc.  However, my rule of thumb for med gear is: ONLY CARRY WHAT YOU KNOW HOW TO USE.

Situational Awareness

I’m often asked by clients how they can increase their safety while out in public. I find this question especially relevant with recent events. Too many times, the answers I hear to this question in the industry are focused on the latest gear, the popular EDC of the day, or the latest hand-to-hand fad that the coolest internet personality is shilling. The best solution is often the simplest, but it is ignored because it is not easy. Why put in effort when you can buy the latest piece of gear?

 

The Problem

 

People have become extroverts on the internet and introverts in public. We have become cocooned in our technology and convenience. We don’t talk to our neighbors or take an interest in our community unless it is on the latest phone app or internet forum. Everywhere you go you see people buried in their devices, disconnected from the real world. It is a double-edged sword. We have become much more in touch with global events, but are now completely oblivious to what is happening around us. Terrorist events keep happening in the United States. One of the popular TTPs of the day is the pressure cooker bomb, as we saw in Boston and more recently in NYC. How many people do you think walked right by them, or didn’t notice them being placed, because they were buried in their phones?

 

The Answer

 

The solution is simple. Look around. Pay attention. Look, listen, smell. Absorb and process your surroundings, constantly. Of course this takes effort and constant repetition until it is habit.  It isn’t as comfortable as staring at your Facebook feed. It isn’t paranoid to be aware- it is our natural, primal state. We were built with the instincts for survival, to be alert for predators or threats- we have just become domesticated. Here is a question I get: “When you go out- what color state of awareness are you in?” It makes me shake my head even to write it. Let me be clear here- if you think a checklist or flow chart will make you effective in a situation, you have been misinformed. The idea that you should constantly check that your state of awareness is in line with the appropriate color is preposterous. The thought that you need an OODA loop to respond to a threat is ridiculous. Too often we think that some quantifiable step-by-step program is the answer to life, when it is not. Life is dynamic and constantly changing. Attempting to shoehorn the environment into rigid steps and categories limits us.

 

YOU ARE EITHER AWARE OR YOU ARE NOT.

 

I can’t make it any simpler than that. Of course, it will undoubtedly be pointed out that high levels of awareness will mentally fatigue you. That is correct. However, it gets less fatiguing with practice and repetition. Life isn’t easy to begin with; suck it up and do it anyway. Be aware of your state of mental fatigue and use that awareness to feed your decision-making process.

 

Be a Sensor

 

Here is the high-effort part. Develop visual scanning habits. People-watch. Use the mirrors in your vehicle. Constantly. Take mental note of what you see and try to remember it. Visually observe the path or road ahead of you- if you see something you don’t like, then adjust accordingly. Plan ahead, constantly “what if”: instead of looking for the latest entertaining post on Facebook, imagine what you would do if a car cut you off, or if a threat appeared around a blind corner. Constantly analyze your surroundings: know what drivable terrain is when in a vehicle, know where your positions of cover and escape routes are when on foot. It isn’t being paranoid- it is being prepared. Don’t be a useless air thief, stealing oxygen from the people who have to put in extra effort to cover your lane because you think awareness is a waste of time, or because you were told that your default level is “yellow”.

 

An Alert Society

 

What I want you to take away from this is a wakeup call. I want you to walk out the door tomorrow and immediately scan your driveway/street/neighborhood. I want you to be looking around as you drive. I want you to be observant of your surroundings. I want you to notice something out of place or an unattended bag- and REPORT IT. But most of all, I want you to be a DISCIPLE of awareness. Disciples are not just followers- they are leaders. They seek out and create more disciples. They spread the word. I want you to spread the word of awareness and self-reliance. I want you to spread the word of taking personal responsibility for not just your safety, but for that of those around you. I want you to create a network of citizen sensors who are monitoring everything around them. This is the way we create a safer society.

 

 

OPSEC in the Tactical Training Industry

OPSEC in the CQB and Tactical Training Industry

 

There is great interest in the tactical training industry for tactics articles. Posts detailing CQB and other tactics frequently drive spikes in page views, social media shares, and business. But what are the risks of such articles? How do we balance providing awareness against helping undesirable elements? I’m often asked specific questions about tactics or approached to write specific articles. Most of the time I decline. Often, the people who have approached me are confused as to why I would make this decision. They sometimes argue that the article will be posted in a closed group or on a vetted forum. This week, I’d like to explain why I limit the videos, posts, and articles on tactical training, CQB, and similar topics to the extent I do.


 

ITAR

 

The best starting point for this discussion is the International Traffic in Arms Regulations (ITAR). There are wide-ranging opinions about ITAR and its applicability- but it does exist. ITAR is the regulatory mechanism the United States government uses to restrict defense-related information and equipment from foreign entities or persons. Under these regulations, US citizens and/or companies can be prosecuted and heavily fined. ITAR is a very broad set of regulations, and compounding this is the broad content of publications like US Army field manuals. Certain topics like CQB and Small Unit Tactics are contained in the regulations, and instruction in them is often derived either from these manuals directly or from the training an instructor received that was based on them. ITAR section 120.9(a)(3) clearly outlines this. If you are in doubt, you may want to give it a read, as this section is designed to regulate training in Tactics, Techniques, and Procedures. The enforceability of broad interpretations of ITAR is debatable, but you should remember that the government isn’t scared to prosecute you with ludicrous amounts of your own tax dollars, whether it has a good case or not.


 

Enemies of America

 

Another key point is that we are at war. Yes, that’s right. It doesn’t matter that we have pulled out of Iraq and Afghanistan is spooling down. We are not the sole authority on whether we are at war or peace; the enemy has a say. Currently we are at war with Islamic extremism (among others, depending on your views) both at home and abroad. Fort Hood, Dallas, and Boston are examples of attacks by these very enemies. The Dallas shooter, specifically, received tactical training from a tactics instructor in Houston. Ask yourself how you would feel if the next shooter read your articles on tactics and then seemingly applied them.

Our country is also being invaded across our southern border. Thousands of “Special Interest Aliens” are apprehended by law enforcement on our southern border each year. These SIAs are from countries like Syria, Pakistan, and Afghanistan, which are full of terrorist operatives. And those are just the ones that are caught.

I think it is reasonable to take the standpoint that nobody should be providing training or information to people who want us either subjugated or dead. Would you agree?


 

Criminal Elements

What about those in our country who have less than honorable intentions- those who specialize in crimes like home invasions, muggings, and so on? Would supplying them with specific techniques and exclusive tactical training be detrimental to law-abiding citizens? Lock picking and restraint defeat are popular topics these days. Would CQB training be an aid to a home invasion gang? Would marksmanship training help the next active shooter? I think these questions are worth asking for the purposes of this discussion. They also broaden the scope of concern when it comes to the disclosure of tactical training and related information.


 

The Internet

Here is where all these problems really come to light. We use the internet to promote our businesses, share training, or discuss things in forums. Some of these forums (or websites) are private and only accessible by users the administrators choose. Let me be clear about something: IF IT IS ON THE INTERNET, IT IS NOT SECURE- PERIOD. This includes forums restricted to LE or military users. If you would not be comfortable posting something on a public domain where everyone in the world can view it, YOU SHOULD NOT BE POSTING IT. I understand that many of these sites are viewed as useful tools for the exchange of ideas between professionals in the industry. However, caution should be exercised as to the specificity of TTPs. The same approach is valid for videos. It is worth considering whether posting a video demonstrating the ballistic penetration capabilities and limitations of various ammunition types against sections of vehicles is a good idea. Of course many of us want to validate our training by publicly showing proof of concept or refuting someone else’s view. Often we do not consider our full possible audience.


 

Censorship

I am a firm believer in and supporter of our Constitution. The Second Amendment guarantees our right to own firearms and to receive training in them. The First Amendment guarantees our right to free speech. This causes many in the industry to cringe when our Executive Branch proposes that YouTube videos could fall under ITAR. However, it detracts from the larger and more important question- SHOULD the information be shared? I don’t like the idea of the government dictating what we say, but should it have to? If we are so concerned about sharing valuable training information with each other, why are we not equally concerned about it falling into the wrong hands?


 

My Policy

We all have our lines to draw as to what we publicly disseminate and what we do not. I have published a series on the tactical rifle on YouTube, but it is limited to very basic fundamentals. Is it something that I would email directly to ISIS? Absolutely not. However, when weighing the content against the value of dissemination- I don’t think it is out of bounds. On the other hand, I don’t post instruction on certain other topics that I feel would be irresponsible- like CQB and lethal shot placement. BUT WAIT, THAT INFORMATION IS ALREADY ON THE INTERNET, SO IT IS NO BIG DEAL! Sorry, that’s a cop-out. I don’t care if some other instructor has posted information on the topic previously- it doesn’t validate the choice. Nor does it make it responsible to further expound on the topic. Limiting the dissemination of more sensitive topics to courses I teach gives me the ability to control, to some extent, who it goes to. Although the people taught could then share the information with others outside of my control, at least I have some means of evaluating the initial recipient. If I see something questionable about a prospective client- then I can refuse them a slot, request further information from them, or just permanently add them to my training blacklist.


 

Conclusion

I have no doubt that some will not agree with some of my statements. Some will. I don’t have authority over others in the industry- they are free to do what they want. What I hope is that, if nothing else, this article may cause you to reconsider something you might post on YouTube, Facebook, or a private forum in the future. Perhaps you don’t post it, perhaps you re-write it so that it isn’t more specific than necessary, maybe you decide to just meet with your audience in person. Keep in mind that the internet is not secure, and it is forever. Once you share something on it- it can go anywhere.

Tactical Shooting vs Competition Training

Speed Without Thought

 

What comes to your mind when you think about a day on the range?

For many, it is the sound of Pro-Timers and shaving seconds off splits and reloads. Often it is running drills commonly found in competition, on IPSC cardboard targets or steel. Or the thought of burning down a drill that can be posted to YouTube for others to admire.

But what are you training?

What is your goal? What is your end-state? And are you actually improving your real world performance?


 

The End-State

 

It may sound counter-intuitive, but to know where to start, we have to know where to end. That is to say, not where our training stops- we should be constantly improving our skills. The end (or end-state) is what we are preparing for. It may be competition shooting, it may be hunting, or it may be self-defense. While some of these end-states may share similar skills, their application and focus may vary. Some of these end-states require more separate skills than others.


 

Training Methodology

 

Let me be clear- competition training can help in developing skills, AS LONG AS YOU MOVE ON. Fundamentally, there is one main difference between competition shooting and real world engagements: TARGET DISCRIMINATION. In most competitions, simple shoot/no-shoot targets are used, either by reversing the white and brown sides of IPSC targets or by spray-painting hand outlines on them. In the real world, it is not so simple. Before pulling a trigger, we must decide if we SHOULD pull the trigger. Often the situations we face in a real world use-of-force scenario are not cut and dried. They may not be clean, and they may not be obviously good guy vs bad guy. There are many shades of gray. You must have a thought process driving actions.

The thought process is one of the most difficult skills to train, and many prospective elite military operators struggle with it. In order to decide to shoot the threat, we must first identify the presence of a threat through proper scanning/clearing, then discriminate the threat (a process I do not discuss on the internet). Once we confirm it is a threat, we must locate the proper round placement (another topic I do not elaborate on the internet), then identify whether a holdover is appropriate. This is the main difference between competition and the real world, and it is a huge one.  But why is this an important distinction? Can’t a competition shooter easily move on to this task?

Yes, and no. One of our constant struggles in shooting is managing our mental bandwidth. We have a finite amount of information we can process at once. The less we are used to performing a task, the more we have to focus on it and the more mental bandwidth we use. This leaves less room to process other pieces of information and tasks. Speed affects our bandwidth as well. The faster we move, the faster we have to process multiple tasks. We mitigate this by getting used to performing tasks: TEACHING our brains how we want to process data, teaching them the correlation between what we SEE and what we want to DO.

What does this mean for the competition shooter? Hypothetically: say you are drilling a steel challenge and we have you start facing away from the steel. We have you run the drill over and over until you are shaving tenths of a second off your time. Then, without you knowing, we move the targets around and change them- then initiate the drill again. What happens? The chances are pretty good that you instantly enter sensory overload. Why? Because you were not moving at a speed at which you could THINK- and rightly so, because it wasn’t necessary for your drill. This is where the rub is: the shooter has stayed at one level in the progression to real world training- they have become USED to moving fast. Once you are used to moving fast, it can be very difficult to force yourself to slow down.

I see this frequently in my Close Quarters Marksmanship courses with shooters who are competitive and have attended many professional shooting courses. Often these shooters get their sights to the target very fast and their body is ready to take the shot immediately- however, their mind is not done with its tactical shooting tasks. In many ways the body and mind start fighting each other in an “I’m ready- No I’m not” argument that can often be visually observed. This results in not following the proper process and losing efficiency, shooting a no-threat target, or failing to engage a threat target. A good football analogy might be a quarterback who trains hard to take snaps, drop back, and throw at a stationary target. He trains until he can perform the tasks perfectly and quickly. We then throw him into a real game with a full defensive backfield and run 3 receivers. How do you think he does? I would propose that he does just as well as the speed shooter who is suddenly thrown into a scenario requiring a level of tactical discrimination he hasn’t trained for.


 

Summary

 

What I am saying here is that we need to have a layered approach to our training, especially in tactical shooting. We need to make sure that we don’t linger at individual layers, unnecessarily. Once we can mentally cope with one layer and can perform it efficiently- we need to move on to the next. If it is too much, we move down a layer and then move back up when we are able to. Staying at one layer for weeks, months, or even years just to gain Pro-Timer progress does not efficiently progress us towards our goals. Simple efficiency is your goal at each layer. If you can perform the task smoothly and efficiently- MOVE ON. Once you reach the level of performing all of your tasks in the layered process in the most complex and complicated scenario within your end-state, and can do it smoothly and efficiently- THAT is the time to focus on doing things faster. Why? Because now you will know how you have to move and think at your end-state. You will understand how all the layers apply and will know what is important and what is not. You will know where you can push things and where it could be detrimental. You will not waste time focusing on something where you should be focusing on something else. You have to know where you end to know where to begin.

The 3rd Sight Picture

The 3rd Sight Picture: Tactical Shooting

 

Often in today’s training industry, much is said about getting a follow-on sight picture after firing a preceding round. The common saying is that if you fire 2 rounds then you should have 3 sight pictures, if you fire 3 rounds then you should have 4 sight pictures, and so on. Is this really true, and why is it a contentious issue? I feel this is a popular topic because it is another task being taught out of context. And because much of the discussion is over close range engagements- that’s where this article focuses.


 

The Controlled Pair

 

Due in large part to 15yd flat range warriors, many falsehoods and mantras have developed about the controlled pair in shooting training. In many senses, it has become a drill all on its own- often accompanied by a whistle and Pro-Timer. The DRILL becomes the GOAL. However, the controlled pair is not a drill on its own. So, what is it?

The Controlled Pair is a method for dealing with multiple targets.

            That’s right, the controlled pair is only PART of a process, which is to efficiently engage multiple targets through tactical shooting. This means that while progressively engaging through a close range threat scenario, we place two rapid (but aimed) shots on each threat. After this is complete, we RECLEAR and re-engage as necessary. So, while we are placing two rounds on the threat at first- it is not necessarily true that we only shoot the threat twice. To be sure- if we had a situation where we legitimately had only ONE small area to clear that only contained ONE threat, we would continue to engage until the threat is no longer a threat.


 

Eye Movement in Shooting Training

 

We all know what moves first, from one target to the next, while engaging multiple targets, right? THE EYES. Why is this so important? Because driving a gun from one point to another is easy. What slows us down is THINKING, and in order to THINK we have to SEE. So, we have a number of tasks to accomplish before pulling the trigger again. We must continue scanning rapidly to LOCATE the next target. We then have to DISCRIMINATE the target (decide threat/no-threat), and then find the proper POINT OF IMPACT to place rounds for an incapacitating shot. Then (if necessary) we have to identify HOLDOVER. It is critical for all these tasks to be complete BEFORE your sights enter the target, because your line of sight establishes where your sights will go. If we are running a rifle with an optic at close range and we have not shifted our eyes to the holdover point before the sights arrive, we will either place rounds low or have to shift the rifle to re-aim at the proper spot. (Note: while I will not cover the specific method of target discrimination on the internet, I will state that it is not done with your focus on the point of aim or holdover. So, if you were thinking that a lot of eye movement is not happening on the target- you are incorrect.)

It is ingrained in almost every shooting discipline and tactical shooting training that when engaging multiple targets, after firing the required number of rounds your eyes should immediately shift to the next target. This shift ideally happens as soon as the last shot breaks. Now, of course we could talk in circles and attempt to validate the 3rd sight picture methodology by stating that your 3rd sight picture is on the next target. But is that really accurate? Does it fit what we are attempting to train? Are we confusing the process by attempting to apply a general rule?


 

Training Methodology

 

Too often we allow TASKS to become DRILLS. With a layered approach to training this should not happen, but it is what has occurred in this case. There are times during the initial stages of fundamental training when we DECOMPLICATE things. We isolate as few tasks as possible to perfect them. Of course it is true that we should not build bad habits, and in a certain sense- not reinforcing proper follow-through does build bad habits. But this is in the GROUPING stage. Of course we should stay with the grouping stage for a considerable amount of time. The problem is- most shooters don’t. This results in instructors having to correct deficiencies in fundamentals during a later stage of training, where they shouldn’t have to be doing so. Let’s unpack what I’m saying here with an example of a training approach.

Before reaching the point of controlled pairs, we should have already mastered the following:

  • Safe Handling
  • Load, Unload, Reduce Stoppage
  • Marksmanship Fundamentals (Through Grouping)
  • Weapons Presentation (or the Draw for Pistol)
  • Trigger and Safety Manipulation

 
That’s a fairly realistic list of skills that should be mastered before getting more aggressive with tactical shooting training. Now let’s think about that for a moment. If we have mastered those skills, then HOW LONG do we need to spend on controlled pairs before MOVING ON to multiple targets? Not a lot of time at all, I would say. You’re just firing 2 rounds instead of the 5-10 you were firing during grouping (if any of you chuckled and said it is done FASTER- you are wrong, and I have an article coming soon just for you). This task should only take a couple of iterations if you have trained the supporting fundamentals of tactical shooting properly. Unfortunately, many shooting training sessions consist of dumping countless magazines in never-ending controlled pairs, accompanied by Pro-Timer beeps and whistle blasts.

So, if we are only spending a moment at this point- how are we building bad habits? Of course, every repetition counts- but consider this: if our desired end-state is to get our eyes moving as soon as possible to the next target, or to locate the next possible threat in our area, are we building bad habits by remaining on our sights during the follow-through of our second shot? Think about that. If you have trained on multiple targets, you know how hard it is to train yourself to release that sight picture and get to the next target. It is very difficult and often very unnatural. This is because our core instinct is to be threat-based- to stare at the one thing that is our immediate threat. It takes effort, dedication, and training to look away from it. It is the same reason drivers fixate on the tree instead of looking for a path of drivable terrain around it- which is why you often see a car wrapped around a tree or pole with clear paths on either side.


 

Summary

 

There are situations where you need to stay on your sights. Just remember that these are fundamental tasks that need to be progressively layered from bottom to top in a streamlined process. Don’t live your life on a flat range, at one layer in the process. Don’t practice TASKS out of CONTEXT. Train the process and apply it properly to the specific scenario you find yourself in. Don’t let TASKS become your DRILLS, otherwise your DRILLS won’t meet your END-STATE.