Pardon my bluntness, but it is time for the debate over the need for “precision agriculture” to end. Farmers cannot afford not to adopt precision agriculture, and the world needs the productivity increase it can produce. The future will be dominated by increasingly precise agriculture. Agriculture, done right, like any other professional undertaking, requires precision. Would one question whether one’s surgeon should operate with a sharp scalpel on the precise location requiring attention, or whether one’s stockbroker should place orders for the precise stock at the precise price desired? Agriculture is equally demanding.

Farmers, and the entire ag ecosystem, desperately need increasingly precise agriculture. Absent general adoption of precision agriculture, with its new approaches and technologies that produce superior decision making, profitability, productivity and sustainability, the earth will become a very hungry and unstable planet. Don’t accuse me of being Malthusian. Malthus was wrong because he could not or would not foresee the advances wrought by innovators and science. Those predicting world starvation will be proven wrong again. However, it is up to us, just as it was up to the generations following Malthus, to prove the argument wrong. On a more personal level, innovation is the key to farmers’ enduring competitiveness.

Precision agriculture, like every other advancement of mankind, will represent a continuum of improvement. It will have limitations; those limitations will be addressed by serious players, the science will improve, the quality of data will improve, and learnings from within and outside ag will be incorporated to improve further. Look at the first-generation iPhone for a good example outside of agriculture. What could it do at its launch in 2007, just 11 years ago, versus today’s hand-sized computer? I can recall the early days of data communications, when we celebrated speeds of 4.8 Kbps. That was the late 80s. Now, we define data speeds against the standard of fiber to the home, or 1 Gbps – 208 thousand times faster. What if we had said, “this stuff is no good, it’s too slow,” and forgotten about it? Some people did – to their great misfortune. Others saw the slow speed as a foundation to build on and a challenge to rapidly improve. That attitude fostered the continuous improvement that led to today’s marvel.
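The speed comparison is simple arithmetic, and worth checking. A two-line sketch (in Python, purely for illustration):

```python
# Comparing a late-1980s 4.8 Kbps data link to 1 Gbps fiber to the home.
early_kbps = 4.8          # late-1980s link speed, in Kbps
fiber_kbps = 1_000_000    # 1 Gbps expressed in Kbps

ratio = fiber_kbps / early_kbps
print(f"{ratio:,.0f}x faster")  # → 208,333x faster
```

That factor of roughly 208 thousand is the payoff of treating a slow early technology as a foundation rather than a dead end.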

With multiple soil types reacting differently to various inputs, microclimates, and more, production can vary hugely within small areas. We have measured 100 BPA (bushels per acre) of variability within a 150-acre field. If you are being guided by your consultant to make decisions at the field level, you are very likely not farming precisely enough. Conditions within counties vary widely. If you or your advisor are assessing your performance against the county average, your measurement is imprecise. Weather conditions vary greatly over time and have major impacts on yields year to year. If you or your advisors are measuring performance against historical production without explicitly accounting for weather effects year to year, your measurement is very likely in error and will lead to unproductive decisions. MSD has identified 56 variables (some collinear) outside of farmers’ control that, along with farming practices, determine yield. Soil types/conditions and various weather-related measures are the most prominent. Generally, understanding the extent of the contribution of those variables at a sub-field level (the management zone) is the only adequate basis for assessing the competitiveness of farmers’ performance and for assessing whether the farming practices implemented are creating a favorable or unfavorable trend and producing the desired results.
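A tiny sketch makes the averaging problem concrete. The zone names and yields below are invented for illustration; the point is only that a single field-level number can hide exactly the kind of 100 BPA swing described above:

```python
# Hypothetical yields (bushels per acre) for four management zones
# within one field. A single field average masks the sub-field spread.
from statistics import mean

zone_yields = {"zone_A": 135, "zone_B": 210, "zone_C": 178, "zone_D": 110}

field_avg = mean(zone_yields.values())
spread = max(zone_yields.values()) - min(zone_yields.values())

print(f"field average: {field_avg:.0f} BPA")  # one number per field...
print(f"within-field spread: {spread} BPA")   # ...hides a 100 BPA swing
```

A decision made on the 158 BPA average would be wrong for every one of these zones; only sub-field measurement reveals where the practice changes belong.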

The questions in ag are not about data science. I have been responsible for data analytics, economic analysis, and benchmarking in three companies, each large enough to fit within the top 5 ag companies in size. One, AT&T, would be far larger than any current ag company. My first benchmarking experience was at AT&T – in 1983, 35 years ago. The mathematics driving benchmarking were mature at that time. The obstacle we faced was the same obstacle ag has faced: INADEQUATE DATA QUANTITY AND QUALITY. The data problem has been largely solved in communications, manufacturing, financial services and other industries. It can be solved in ag as well, but only by serious, expensive, painstaking, consistent collection of lots of high-quality data. There are no shortcuts.

What do I mean by “lots”? At MSD we have collected over 1.3 billion subfield yield samples across 25 states and 7 years, starting in 2010. We have also collected 56 variables for each of those 1.3B samples, including soil and 20-plus years of weather data, which we have individually correlated against the yield collected at the geo-granular level of 150 sq. ft. We also, luckily, benefitted from great weather variability (2012, 2013, 2014).

What do I mean by “quality”? In year one, we did not take steps to compensate for the inherently hostile environment ag represents – farmer inconsistency in yield monitor calibration, and environmental variation in heat, moisture, dust, and more. Ag data collection is different from any other of which I am aware. Open Air Equity Partners’ (my investment company) portfolio companies collect data, process it, and perform data analytics for connected cars, homes, aircraft and more. No challenge compares to ag. Our reaction was to spend $80M to get “enough” and to “get it right”. We inspected combines before permitting them to begin harvest of each field, at extraordinary cost – a technician for 4–5 hours. The techs completed a 300-point checklist, 30 items of which directly related to the accuracy of data collection. We calibrated. We monitored calibration. We imposed quality control processes following collection. We extensively forecasted and back-tested and achieved high R-squared scores in our statistical studies. What difference did that make? Prior to adopting those steps, we disqualified almost two-thirds of the collected data. With those steps in place, we disqualified about 3%. The problem we face today is not a lack of science, including data science, capable of creating a step-function increase in productivity. The missing ingredient is enough precise data to fuel the science of precision agriculture.
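The post-collection quality gate can be pictured as a simple filter. This is a minimal sketch with invented records and an assumed plausibility range, not MSD's actual rules; it shows only the shape of the idea – reject samples from uncalibrated monitors or with implausible readings, then track the disqualification rate:

```python
# Invented yield-sample records; field names and thresholds are
# illustrative assumptions, not MSD's actual QC criteria.
samples = [
    {"bpa": 182.0, "calibrated": True},
    {"bpa": 171.5, "calibrated": True},
    {"bpa": -4.0,  "calibrated": True},   # sensor glitch: negative yield
    {"bpa": 168.0, "calibrated": False},  # uncalibrated yield monitor
]

def passes_qc(sample):
    """Keep only calibrated samples within an assumed plausible BPA range."""
    return sample["calibrated"] and 0 < sample["bpa"] < 400

kept = [s for s in samples if passes_qc(s)]
disqualified = 1 - len(kept) / len(samples)
print(f"disqualified: {disqualified:.0%}")  # → disqualified: 50%
```

In practice the gate described above runs on a 300-point checklist and calibration monitoring rather than two boolean tests, but the payoff is the same: the disqualification rate is the measurable output of the quality program.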

I said earlier that farmers cannot afford not to adopt precision agriculture. Here is an illustration. We have a great data set for corn in Iowa. I posed the question: how much would an Iowa corn farmer’s profit per acre increase if production moved from the 50th percentile (the median) to the 60th percentile (the top 40%)? By the way, these measures are not theoretical. They are actual production ranges, actually observed as corn was harvested by our company, and they are explained by farmers adopting different and better farming practices. The answer, including the incremental costs of increased production, was $20 per acre. Moving from the median to the top 40% does not appear Herculean. In fact, we consider the 75th percentile the “best practices” standard, and achieving that level of improvement would further increase profits. One can afford precision agriculture with these estimated returns! In fact, one cannot afford the status quo.
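The mechanics of that percentile comparison can be sketched in a few lines. Every number below – the yields, the corn price, the incremental cost – is invented for illustration and does not reproduce MSD's Iowa data or the $20 result; the sketch only shows the calculation's shape: take the yield distribution's 50th and 60th percentile cut points, price the difference, and subtract the cost of getting there:

```python
# Hypothetical percentile-to-profit calculation; all inputs are
# assumed values, not MSD's actual Iowa corn data set.
from statistics import quantiles

# Simulated per-acre corn yields (bushels per acre) across fields
yields = [150, 162, 168, 175, 180, 184, 190, 196, 205, 214]

# quantiles(..., n=100) returns the 1st..99th percentile cut points
pcts = quantiles(yields, n=100)
p50, p60 = pcts[49], pcts[59]

corn_price = 3.50        # $/bushel, assumed
incremental_cost = 10.0  # $/acre to achieve the higher yield, assumed

profit_gain = (p60 - p50) * corn_price - incremental_cost
print(f"extra profit: ${profit_gain:.2f}/acre")
```

With a real distribution of observed yields and real input costs in place of these placeholders, the same arithmetic produces the per-acre figure quoted above.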

At MSD, working with GISC, the first Ag Data Cooperative, we aspire to apply our extraordinary data set and data science to Validate: Validate that progress is being made in crop yields after accounting for weather-related variability; that production and financial goals are being met; that products and practices are producing the results required to justify their costs and deliver ROI; that land is performing competitively compared to other comparable land; and on and on. Unlike every other ecosystem of which I am aware, agriculture has no J.D. Power, no Underwriters Laboratories, no Nielsen, no Morningstar – no independent source of Validation, free of the conflicts that inherently arise when one is assessing a product one has sold or advice one has given. We are not and will not be conflicted by selling seed or chemicals or by making product/practice recommendations. We intend to provide the factual foundation for others to make recommendations and decisions, and we will continuously Validate. In most businesses in America, a simple formula guides the sequence of events: Plan, Do, Check, Act (PDCA). This simple formula was introduced by the famous W. Edwards Deming; it led to the proliferation of quality and process re-engineering programs in the late 80s and early 90s and ultimately inspired the Malcolm Baldrige Award. MSD aspires to help others satisfy the “Check” function. Our mantra is “Innovate but Validate.”

There is too much at stake to compromise, to take an “it’s good enough” approach, whether that refers to products and practices or to the quality of data supporting decision making. We can, increasingly, know the best answer and act with precision, and we can Validate that we have done so.