September 28, 2023

Wildlife Management: Scientism, Abstraction, Encapsulation, Interface

Today I was reading Wretchard’s “The Case of the Missing Catastrophe” over and over, as it contains some pretty heady stuff. As invigorating as the words may be, or perhaps mind-blowing, depending on one’s perspective and mental prowess, I believe they are worthy of additional, relevant thoughts, perhaps knocked down a peg or two into more understandable terms for common brains like mine.

What Fernandez is describing can be broken down into two distinct realities – deliberate manipulation and the exploits of useful idiots. Maybe I can make a bit more sense out of this.

Wretchard is discussing the predictions made by most media that we’re all gonna die because Donald Trump first became president, then endorsed recognizing Jerusalem as the capital of Israel, and now the GOP is planning a tax reduction. Because prophesied catastrophes have failed to match the cries of the media, and others, Fernandez suggests that the “models” which drive the predictions of death and destruction at the hands of liberals are being found out to be failures of the biggest kind. Some, though not many, can actually recognize these failed predictions, based on “modeling,” and it is growing tiresome. Others claim that this is the reason “outsiders” like Trump got elected, and why most people barely lifted a match, club, or rock in protest of the Jerusalem capital decision. I think it safe to say that modeling, designed for outcome-based results, plays a vital part in our everyday lives.

Hidden behind intellectual topics of centralization, globalization, “integration with nature and society,” and such things as evolution and “intertemporal coordination,” what is being discussed is ideology. Idealism always begins with an idea. Where once “models” were the ideas of men used to manipulate society, in today’s power-and-control institutions, which more closely resemble technocracies than democracies, the employment of computers to sort over ideas and information, hiding what is not wanted and fronting what fits a narrative, is commonplace. Are we now to understand that somehow a person is exempt from a dishonest promotion of idealism because the “computer modeling” made them do it?

The intentions of modelers remain the same. Our love affair with technology, and the way it has been sold to the public, mentally programs us to believe that computer modeling produces a better result than simply the ideas of a man. Strange, isn’t it? The stage is set.

Computer modeling is common practice these days. It also works as a major tool of destruction in the ripping apart of society and politics (they go hand in hand, as has been designed). The dishonest practice has caused major failures in the scientific world, and those failures become the means to justify social and political perversion in order to achieve agendas. It is a contributor to the injection of anger and hatred into our society as well.

For several years I studied computers and programming. I know enough to be dangerous. I do know how programming works – called coding today. I know how to hide and manipulate data to achieve desired results; that was one of the most basic lessons in programming. Coding today requires knowing what end result one desires and writing a program to accomplish it. Imagine what happens when this is placed in the hands of corrupt individuals, groups, corporations, 501(c)(3) nonprofits, etc., with something other than completely honest dissemination in mind.

I have often said that we live in a Post-Normal world today – up is down, right is left, right is wrong, black is white, etc. With enough money, anyone can pay a computer-literate technician to model anything. It has worked so well that government agencies, along with our court system, eagerly rely on faulty and dishonest computer modeling in rendering decisions and crafting legislation.

In the case referenced in the linked-to article, the masses rely so heavily on a heavily manipulated media that they are unaware they are being propagandized with only those things the manipulators want them to know.

This same process is at play pertaining to wildlife management at every level in this country.

In the referenced article, a quote that I was taken by was pointed out to me, from someone commenting on how computer programmers/modelers deal with complex issues: “Encapsulation enables programmers to avoid conflicts … the code of each object still manipulates data, but the data it manipulates is now private to that object. … This discipline enables programmers to create systems in which a massive number of plans can make use of a massive number of resources without needing to resolve a massive number of conflicting assumptions. Each object is responsible for performing a specialized job; the data required to perform the job is encapsulated within the object.”

“Abstraction provides stable points of connection while accommodating a wide range of change on either side of the abstraction boundary. … The abstract purpose is represented by an interface … multiple concrete providers can implement the same abstract service in different concrete ways.”
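For those who want to see what these quoted terms look like in practice, here is a minimal sketch in Python. The class names and numbers are invented for illustration; the point is that callers see only the abstract interface, while each concrete provider keeps its data, and its assumptions, private to itself:

```python
# A minimal, hypothetical sketch of the two ideas quoted above.
from abc import ABC, abstractmethod


class PackCounter(ABC):
    """Abstract interface: callers ask for a population estimate
    without knowing how it is produced."""

    @abstractmethod
    def estimate(self) -> int: ...


class FieldCountProvider(PackCounter):
    """Concrete provider #1: encapsulates raw field counts.
    The _counts list is private to this object."""

    def __init__(self, counts):
        self._counts = list(counts)  # encapsulated data

    def estimate(self) -> int:
        return sum(self._counts)


class ModelProvider(PackCounter):
    """Concrete provider #2: same interface, different concrete method.
    The correction factor is hidden inside the object."""

    def __init__(self, counts, correction=1.3):
        self._counts = list(counts)
        self._correction = correction  # an assumption the caller never sees

    def estimate(self) -> int:
        return round(sum(self._counts) * self._correction)


def report(provider: PackCounter) -> None:
    # The caller depends only on the abstraction, not the implementation.
    print(f"Estimated population: {provider.estimate()}")


counts = [12, 9, 15]
report(FieldCountProvider(counts))  # Estimated population: 36
report(ModelProvider(counts))       # Estimated population: 47
```

Notice that the caller gets two different “estimates” from the same field counts and has no way, through the interface alone, of knowing why they differ.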

This is a pretty fancy way of stating that programmers can con, and are conning, the rest of the world with their false manipulation of twisted and perverted data to achieve whatever they, or anybody paying them, want.

I have serious doubts that complexity is the issue when it comes to computer modeling. When the modeling is driven by corruption, for corrupt purposes, complexity is irrelevant except to the extent of the desired outcome and, perhaps, the need to present some kind of distraction or cover-up by creating a fake controversy.

In computer modeling – bearing in mind that wildlife management today relies heavily on modeling, whether agencies do it themselves or utilize someone else’s work – it is pointed out above that programmers deal with issues such as “encapsulation,” “abstraction,” and “interface,” to name a few. Combine these headings with corruption and we have new-science Scientism, i.e., “excessive belief in the power of scientific knowledge [real or false] and techniques [for corrupt reasons].”

First, a “programmer” (I place programmer in quotes because that could be anyone from one lone programmer to any number of accomplices) collects data (which begins as useless information until placed in the desired order) and enters it into the computer. Then someone must decide what data is useful, for what purposes it is useful, and how to “encapsulate” that information, i.e., hiding it or using it to drive the outcome.

Encapsulating data is necessary for achieving desired results while hiding information that might cause conflicts or controversy. Politicians are masters at encapsulating information. That’s why they never answer the questions asked of them. They hide what they don’t want you to know and sell you on what they do.
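As a simple, hypothetical illustration of the point, consider an object that holds a full dataset privately but only ever reports the slice that was chosen for it:

```python
# A minimal, hypothetical sketch: an object can hold the full dataset
# privately while exposing only a chosen slice of it.
class DepredationReport:
    def __init__(self, records):
        self._records = list(records)  # full data, private to the object

    def summary(self, cause="wolf"):
        # Only records matching the chosen cause ever leave the object;
        # the caller cannot see what was excluded.
        return sum(1 for r in self._records if r["cause"] == cause)


records = [
    {"cause": "wolf"}, {"cause": "coyote"},
    {"cause": "wolf"}, {"cause": "disease"},
]
report = DepredationReport(records)
print(report.summary())  # 2 -- the other causes stay hidden inside
```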

In today’s computer modeling, “abstraction” may be the single biggest mode of corruption, especially depending upon the chosen “interface.”

Abstraction, “the quality of dealing with ideas rather than events,” is where the real scientific process gets deliberately lost. Abstraction is necessary to promote ideas (idealism/environmentalism) rather than actual and honest scientific data. Several ideas/events can be contained within “boundaries,” including hidden data, and meted out through “interfaces” only to those listed (concrete providers) as in need of the results (those paying the money).

There is a common, tire-kicker expression used to describe these worthless computer-generated outcomes – “garbage in, garbage out.” In many of these cases that is precisely what is taking place. To some of us, the outcome is garbage because the input is garbage. It spells lots of dollars and cents to those dishonest people manipulating the truth. They are gaming the system for political or monetary gain.
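A toy example, with invented numbers, makes the expression concrete: the arithmetic is correct in both cases below, and only the quality of the input differs:

```python
# "Garbage in, garbage out": the estimator is correct either way,
# but a biased input sample biases the output anyway.
true_population = list(range(1, 101))  # 100 animals, ids 1..100

full_sample = true_population[::10]                          # representative
biased_sample = [x for x in true_population if x > 80][::2]  # cherry-picked


def mean(xs):
    return sum(xs) / len(xs)


print(mean(full_sample))    # 46.0, near the true mean of 50.5
print(mean(biased_sample))  # 90.0, wildly off -- same math, bad input
```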

Early on I said there were two distinct realities we are dealing with here: deliberate manipulation and the exploits of useful idiots. I would suppose that there is some overlap, to varying degrees.

We must first understand that modeling, and the effects of this method, does not happen only inside a computer. Know that the “modeling” began in someone’s brain. It’s a process and, yes, it can be a deceitful one as well. While the computer models yield results, often sought-after results, the mental process is taught and carried down through many avenues of brainwashing and propagandizing. In short, we become programmed to think and operate like a computer modeling program in order to reach the desired end.

I have attended seminars in which the goal of the administrators is to manipulate attendees into becoming “change agents.” In other words, they want to brainwash (I know people don’t like that expression, however…) you to accept their propaganda (false modeling) and then go back to where you came from and change everyone’s thoughts to be like theirs. This is all a part of the “modeling” enterprise ruling our world.

Computer modeling is not always bad, when its results are used with an honest understanding of how they were achieved. It is almost never used that way, and that is why my focus seems to be on the criminal aspect of deliberate and dishonest manipulation of the truth. The deliberate manipulators are those whose bent it is to deceive for monetary or political gain. We see such computer modeling in open-to-the-public exchanges involving climate change and wildlife management. Applying the methods I’ve described above, it is easy to see that dishonest encapsulation, abstraction, and interfacing can reap huge monetary windfalls as well as political gain and control.

Dishonest environmental and animal rights groups, and there are thousands of them, pay lots of money for computer models that promote their agendas. An ignorant populace, which itself relies upon computer-modeled propaganda from multiple media sources, is quick to accept a model presented as a scientific finding. It is a part of our rigged system.

A book could be written citing all the cases where modeling is used as scientific fact for all the wrong reasons. The act is criminal, carried out by criminals.

And so, with those powerful enough to control the way wildlife management is discussed employing modeling as the foundation, is it any wonder that our fish and wildlife employees are nothing more than propagandized automatons, spoon-fed computer modeling as useful scientific data? These become the “useful idiots” who empower those corrupt purveyors of dishonest modeling as science.

When you combine actual computer modeling with “education” in the mental version of modeling, we march together, as change agents, into a dishonest world fraught with false knowledge and deception. Many within our fish and wildlife agencies across this land have been reared on modeling and taught the process, resulting in a way of thinking that accomplishes the same thing.

Can this be reversed?


How Accurate Are State Deer Harvest Estimates?

According to an article found at Outdoor Life: “Hunters want it [deer harvest estimate] to be important, but they also don’t believe it. They say how can you know how many deer were killed if you didn’t check my deer? And that’s true. These estimates aren’t down to the individual deer, but scientifically this is an accurate and proven way to estimate deer harvest. Trends, though, are most important.”

It is my opinion that what hunters are interested in, at least initially, is a report from state wildlife officials on the deer harvest, whether estimated or as accurate as it can be, in order to observe the trend taking place, and they don’t want to wait several months for that basic information. For those, like me, more interested in the actual harvest data, I understand having to wait a reasonable amount of time to get it. For an “estimate,” such guesses should be available within a few days of the season’s closure.
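For what it’s worth, the “accurate and proven way to estimate deer harvest” the article refers to is, at bottom, survey expansion. Here is a minimal sketch, with made-up numbers, assuming a simple random sample of license holders and at most one deer per hunter:

```python
# A minimal sketch of a survey-expansion harvest estimate.
# All numbers are invented for illustration.
import math

license_holders = 200_000   # total licensed deer hunters (assumed)
sample_size = 5_000         # hunters surveyed
reported_kills = 1_150      # deer reported killed by the sample

harvest_rate = reported_kills / sample_size   # kills per surveyed hunter
estimate = harvest_rate * license_holders     # expanded statewide estimate

# Rough binomial standard error, valid if each hunter takes at most one deer
se = math.sqrt(harvest_rate * (1 - harvest_rate) / sample_size)
low = (harvest_rate - 1.96 * se) * license_holders
high = (harvest_rate + 1.96 * se) * license_holders

print(f"Estimated harvest: {estimate:,.0f} ({low:,.0f} - {high:,.0f})")
# Estimated harvest: 46,000 (43,667 - 48,333)
```

The expansion itself is simple; everything rides on whether the sample honestly represents the license holders.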

But what of the science of deer management? If all wildlife officials are interested in is survey trends, I’m not so sure I can have a lot of faith that the management plan is being laid out properly when the agency doesn’t know the population at any given point. There need to be some kind of checks and balances in order to have confidence that the modeling is working. Modeling has a poor track record. It would seem that using only trends would mean problems are discovered too late to make adjustments.

Either way, the idea of harvest estimates issued immediately at the conclusion of deer hunting – or bear or moose, etc. – is for the hunters. It’s information they would like to have. It’s a way to inform them as to whether they are getting the best bang for their buck – pun intended.



Post-Normal Science Concludes Wolf Control Increases Livestock Depredation

Below is the Abstract from a “quasi-experimental” study in which outcome-based, paid-for conclusions determined, through modeling, that wolf control caused increases in livestock depredation in the year following disruptions to packs near livestock regions.

If an honest scientist were to assess this “quasi-experimental” research for what it is, I would assume it would be considered mostly worthless nonsense. Overlooked in most of these studies are the words used to describe the quasi-results of modeling, i.e., “estimate,” “the odds,” “possible reasons,” “may be,” “may sometimes.”

It appears that for the actions they took, they used models and arrived at some numbers. But do they really mean anything? First consider that this group of researchers got some of their information, “wolf population estimates, number of breeding pairs, and the number of wolves killed,” from the U.S. Fish and Wildlife Service’s Interagency Wolf Reports. There should be little disagreement with the fact that these estimates are barely estimates, are deliberately low-balled, and are arguably inaccurate as hell. In short, they are political.

Missing from the study, from what I can tell, is any factoring into the modeling of what was transpiring with the wolves’ natural prey base. Certainly no real conclusions can be drawn unless all aspects of the natural prey base for wolves are accurately calculated and placed into the modeling equation.

Modeling is mostly nonsense and should be used, if at all, for purposes of discussion only, as history, short as it is with this kind of modeling, reveals it to be extremely inaccurate and easily manipulated to achieve desired outcomes.

From my perspective, what gives away the biased intent of the study is revealed in the Abstract, where it states, “but we recommend that non-lethal alternatives also be considered.” (emphasis added) I wasn’t really aware that the purpose of “scientific” research was to make recommendations on how wildlife should be managed…unless, of course, the study was funded by someone looking for such a recommendation. If so, and it certainly appears that way, this is a classic example of “post-normal” or “new-science” outcome-based manipulation of reality, also referred to as “romance biology.” It should have no place in any real scientific community, and yet the push has been on for many years, from the Environmental Movement, to “find new understanding” and shift the paradigm as to how wildlife management is discussed.

However, indications from the study might not be too far off in some of the things that were discovered, or revealed, whether intended or not. There was some discussion about how “disruptions” to packs “may be” a contributing factor to increased depredations on livestock by wolves. More and more studies, even from the real scientific community, are beginning to uncover troubling information: due to hybridization of wolves, normal and natural behaviors are changing, reducing the dominance of the breeding female’s progeny within a pack. This results in multiple litters within a pack. The changed behavior infused by hybridization, combined with multiple litters, i.e., larger-than-normal packs, “may be” contributing to coincidental, small increases in livestock depredations in what appears to be the year following a culling of wolves on the order of less than 25%. Where is this information made available in this study?

Few, myself included, will argue with the point that little change in livestock depredations will result without, at the least, a reduction in wolf numbers that exceeds 25%. That’s the entire point of wolf control and better management.

Please read the complete study, linked below, but at least approach it with a better and more honest understanding of what it is and isn’t telling us. The bottom line is that the data being used are estimates; therefore the modeling outcome is also only an estimate. It is not accurate in any way. There is nothing conclusive in this study.

Abstract

Predator control and sport hunting are often used to reduce predator populations and livestock depredations, but the efficacy of lethal control has rarely been tested. We assessed the effects of wolf mortality on reducing livestock depredations in Idaho, Montana and Wyoming from 1987–2012 using a 25-year time series. The number of livestock depredated, livestock populations, wolf population estimates, number of breeding pairs, and wolves killed were calculated for the wolf-occupied area of each state for each year. The data were then analyzed using a negative binomial generalized linear model to test for the expected negative relationship between the number of livestock depredated in the current year and the number of wolves controlled the previous year. We found that the number of livestock depredated was positively associated with the number of livestock and the number of breeding pairs. However, we also found that the number of livestock depredated the following year was positively, not negatively, associated with the number of wolves killed the previous year. The odds of livestock depredations increased 4% for sheep and 5–6% for cattle with increased wolf control – up until wolf mortality exceeded the mean intrinsic growth rate of wolves at 25%. Possible reasons for the increased livestock depredations at ≤25% mortality may be compensatory increased breeding pairs and numbers of wolves following increased mortality. After mortality exceeded 25%, the total number of breeding pairs, wolves, and livestock depredations declined. However, mortality rates exceeding 25% are unsustainable over the long term. Lethal control of individual depredating wolves may sometimes be necessary to stop depredations in the near-term, but we recommend that non-lethal alternatives also be considered.

<<<Link to Complete Study>>>
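For readers curious about the machinery, the “negative binomial generalized linear model” the abstract names can be sketched in a few lines of Python with the statsmodels library. The data below are synthetic and the column names are my own invention; the point is only to show the shape of the analysis, estimates in, estimates out:

```python
# A sketch of the abstract's analysis: a negative binomial GLM of
# depredations on the previous year's wolf kills plus covariates.
# All data here are synthetic; column names are invented for the example.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 75  # e.g., 3 states x 25 years of observations

df = pd.DataFrame({
    "wolves_killed_prev": rng.integers(0, 60, n),
    "breeding_pairs": rng.integers(5, 40, n),
    "livestock_thousands": rng.uniform(50, 500, n),
})
# Build a synthetic response with a positive association, mirroring
# what the study reported (not what it expected).
mu = np.exp(0.5
            + 0.02 * df["wolves_killed_prev"]
            + 0.03 * df["breeding_pairs"]
            + 0.002 * df["livestock_thousands"])
df["depredations"] = rng.poisson(mu)

X = sm.add_constant(df[["wolves_killed_prev", "breeding_pairs",
                        "livestock_thousands"]])
fit = sm.GLM(df["depredations"], X,
             family=sm.families.NegativeBinomial()).fit()
print(fit.summary())
# With a log link, exp(coef) on wolves_killed_prev is the multiplicative
# change in expected depredations per additional wolf killed last year.
```

Note that nothing in the model itself certifies the inputs; if the wolf counts going in are political estimates, the coefficients coming out inherit that.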


Predicting Human/Wolf Conflicts

Note: I’m still laughing!

Abstract

Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques and various input data to analyze ten different input parameter sets and address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that depending on which assumptions on habitat relationships are applied in the model and which modeling techniques are chosen, the amount of potentially suitable habitat estimated varies greatly. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general, should be interpreted with caution and illustrates the risk for habitat modelers to concentrate on only one selection of habitat factors or modeling technique.

<<<Read More>>>
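The headline number, somewhere between 154 and 1769 packs, is easy to reproduce in spirit. A toy calculation with invented figures shows how changing the a priori assumptions swings the answer by an order of magnitude:

```python
# A toy version of the abstract's headline result: the predicted number
# of packs swings with a priori assumptions. All figures are invented.
total_area_km2 = 357_000  # roughly Germany's land area

scenarios = {
    # name: (fraction of land deemed suitable, assumed home range in km2)
    "strict habitat rules, large home ranges": (0.10, 250),
    "lenient habitat rules, small home ranges": (0.50, 100),
}

for name, (suitable_frac, home_range_km2) in scenarios.items():
    packs = total_area_km2 * suitable_frac / home_range_km2
    print(f"{name}: ~{packs:,.0f} packs")
# strict:  ~143 packs
# lenient: ~1,785 packs, an order-of-magnitude spread from assumptions
```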


Montana FWP: Modeling More Accurate Than Counting Wolves

“Researchers conducted a study of the new technique from 2007 to 2012. The new method, called patch occupancy modeling, uses deer and elk hunter observations coupled with information from radio-collared wolves. The statistical approach is a less expensive alternative to the old method of minimum wolf counts, which were performed by biologists and wildlife technicians. The results of the study estimate that for the five-year period, wolf populations were 25-35 percent higher than the minimum counts for each year.”

<<<Read More>>>
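The arithmetic behind a model estimate exceeding a minimum count is simple enough to sketch. The real patch occupancy model is far more elaborate, but the direction of the correction is the same; the detection probability below is assumed purely for illustration:

```python
# Why a model estimate exceeds a minimum count: a minimum count only
# includes wolves actually verified, so dividing by an assumed detection
# probability scales it up. Both numbers below are made up.
minimum_count = 500     # wolves verified by biologists and technicians
detection_prob = 0.78   # assumed probability a wolf is ever detected

estimate = minimum_count / detection_prob
print(f"Detection-corrected estimate: {estimate:.0f}")            # ~641
print(f"Above the minimum count by {estimate / minimum_count - 1:.0%}")  # 28%
```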


New Global Climate Status Report and the End of Sea Level Rise

This is from the Space and Science Research Corporation:

You are invited to obtain your personal copy of the just-published Global Climate Status Report (GCSR)©, Edition 3-2013, dated September 10, 2013.

The GCSR is the only quarterly, non-governmental, authoritative, comprehensive climate status report available in the US. It is published by the Space and Science Research Corporation (SSRC), in Orlando, Florida.

This global climate report is an apolitical review of the climate and is focused on the factual status of the Earth’s climate. It is also based on the use of solar activity forcing (SAF) models for its climate predictions, which have been shown to be the most reliable ones available in accurately predicting climate change, especially when compared to the very low-reliability greenhouse gas models.

The GCSR has as its editors Mr. John L. Casey, the SSRC President, and Dr. Ole Humlum. Mr. Casey is the author of the internationally acclaimed climate book “Cold Sun” (see www.coldsun.net) and was recently named “America’s best climate prediction expert.” Dr. Humlum is a distinguished geomorphologist and glaciologist from the University of Oslo, Norway.

This special 85-page edition of the GCSR is unique in its coverage of the end of global sea level rise, along with the regular status of the twenty-four climate parameters monitored by the SSRC. Additionally, this edition continues with its prediction of significant global cooling for the next thirty years, in view of the end of global warming and the start of a potentially dangerous “solar hibernation.”
Of particular significance in this edition of the GCSR are the guest commentaries from two respected scientists:

1. Dr. Nils-Axel Mörner – one of the world’s foremost experts on sea levels. In his commentary, Dr. Mörner, from Sweden, reviews the wide range of values and confusion that exists in sea level measurement.

2. Dr. Dong Choi – a leading geologist and Director of Research at the International Earthquake and Volcano Prediction Center (IEVPC), Canberra, Australia. Dr. Choi’s paper discusses important new findings that establish solid links between solar activity and earthquakes.

The SSRC has also just published the companion Executive Summary of this GCSR edition, which provides a big-picture view of the Earth’s climate status in 25 pages.

Both reports are now available at the “Publications” page of the web site for the SSRC at: www.spaceandscience.net.

Thank You.
