Introduction

The query that traces a parallel debate between ‘what is science?’ and ‘what is science about?’ has a lineage in the notion of scientific knowledge. The creation model of the earth and evolutionary theory seem conjecturally different, since the methodological expansions of the two accounts do not fit each other. A scientific attitude needs to be open to natural laws, an explanatory module of experience, objectivity, and the conjecture of falsifiability, i.e. an open acknowledgement of the possibility of being flawed. This hypothesis invokes science as an intellectual activity nourished by the experience of a world independent of human history and the human mind. Here the realist explanation characterizes an account of objectivity by synthesizing a fundamental principle or general formula that covers all the phenomena of the particular entity without any exception, like Newton’s laws of motion, which give us the general formulae of the theory of motion describing the relation between the motion of an object and the forces acting on it. James B. Conant puts it well: “Science is a dynamic undertaking directed to lowering the degree of the empiricism involved in solving problems; or if you prefer, science is a process of fabricating a web of interconnected concepts and conceptual schemes arising from experiments and observations and fruitful of further experiments and observations.” (Conant 1954: 62).
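As a reminder of the kind of general formula referred to above, Newton’s second law of motion relates the motion of an object to the force acting on it (a standard textbook statement, supplied here only for illustration):

$$\vec{F} = m\vec{a} = \frac{d\vec{p}}{dt}, \qquad \vec{p} = m\vec{v}.$$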

Can we claim that a good number of scientific explanations carry commitments to predictions that exemplify the causal efficacy between a past state of affairs and its future resultant? What sort of certainty is there? One more query that looks pertinent here is, “How could science be possible without inductive and deductive inference?”.

Backdrop

We know that the Pythagorean orientation delimits the spooky religious analysis of the creation of the earth. Against the creationist, Biblical account of world creation, it vindicates knowledge of the mathematical harmony situated in the structural format of the universe. We have yet to refrain from seeing the physical world through the spectacles of mathematics and logic. The meaning of scientific theories differs from the general vernacular use of ‘theory’. Scientific theories are comprehensive, testable, and systematic evidential accounts of the natural world, its phenomena, and objects. Besides compelling regularities, a comprehensive study of science requires two types of laws: universal laws (regularities without exception) and statistical laws (quantitative statements relying on proximity and probability). All the universal laws of science look logically simpler, since the quantifier of the statement is nothing but the universal quantifier itself. For instance,

$$(x)\,(Px \supset Qx)$$

In physics this reads: for everything, or for all cases of x (here x is the universal variable). Now the point is that if any metal is heated (P stands for ‘is a heated metal’ for all cases of x), that same metal expands (Q stands for ‘expands’ for all cases of x). This is the law of thermal expansion stated as a non-quantitative law, whereas a statistical law articulates a quantitative statistical statement based on collected data sets, like “India’s Covid-19 case fatality rate (CFR) slides to 1.7%”. The universal laws of science deal with empirical generalizations or empirically known facts, and they assist in envisaging facts that are yet to be known, from both empirical and necessitarian stances. Rudolf Carnap argues, “The laws of logic and pure mathematics, by their very nature, cannot be used as a basis for a scientific explanation because they tell us nothing that distinguishes the actual world from some other possible world. When we ask for the explanation of a fact, a particular observation in the actual world, we must make use of empirical laws. They do not possess the certainty of logical and mathematical laws, but they do tell us something about the structure of the world.” (Carnap 1995: 11–12) Science relies on experimental laws, but these laws cannot be applied in every branch of science, such as astronomy. Astronomical objects are out of reach, so in the laboratory astronomers can only generate certain conditions, like those of the surface of the sun or moon or of solar systems, that may resemble the astronomical objects. But such laboratory experiments look more like experiments in physics than in astronomy. I wrote elsewhere, “The chemical analysis can make a distinction between two similar objects. It can establish one as an element and another as a compound, though both of these satisfied all the other ordinary criteria for being the same objects.” (Chakraborty 2020: 85) Experimental laws become more accurate when there is a quantitative context that may be measured through observation. The growth of scientific knowledge does not rest on subjective temperament. It is indeed objective, external knowledge that can be known through external verification and recurrence. To develop a scientific position one has to suspend all metaphysical speculation about the world and shrink the world into the contents of our sense experience. Scientific progress depends upon the following:

  • All experiments, based on observation, reasoning, and evidence, must satisfy specific facts and theories.

  • New theories are required to explain the older ones adequately.

  • The amendment of old theories through newer ones must be formulated with improved explanation and adequate evidence.

A J Ayer writes, “My own view is that generalizations of law should be taken as equivalent in content to generalizations of fact, and it would appear to be most in accordance with scientific practices to allow them unrestricted scope.” (Ayer 1972: 16).

Causal Efficacy and Scientific Ventures

In her inaugural lecture at the University of Cambridge in 1971, G. E. M. Anscombe explicitly re-examines the paradigmatic account of causality in terms of necessary conditions and exceptionless generalizations. Yemima Ben-Menahem writes, “Anscombe’s influential Causality and Determination (1971) argues, contra the Humeans, that we do indeed observe and experience numerous instances of causal connection: pushing, breaking, burning, and so on. I agree with Anscombe. Granted, there are also many less evident cases, where the causal connection is not observable, but the same goes for other relations; they too are manifest in paradigmatic situations and remote from immediate experience in others.” (Ben-Menahem 2018: 10) However, Aristotle in Metaphysics (2018, Book IV, Chapter 5) and later Spinoza in his succinct work Ethics (Spinoza 2006) think of causality in terms of necessity. Spinoza says, “Given a determinate cause, the effect follows OF NECESSITY, and without its cause, no effect follows” (Spinoza 2006: Book I, axiom III). This connection between cause and effect remains a logical connection. The conception of conformity in logic instigates the rationality of speculative argumentation. So there may be a categorical shift from logic to methodology, where rationality takes a considerable role. But Hume undermines causal connections by arguing that no logical contradiction would occur if we supposed a case where the antecedent happened but the consequent did not ensue. Donald Davidson asserts, “But in the Treatise, under ‘rules by which to judge of causes and effects’, Hume says that ‘where several different objects produce the same effect, it must be by means of some quality, which we discover to be common among them. For as like effects imply like causes, we must always ascribe the causation to the circumstances wherein we discover the resemblance.’ Here it seems to be the ‘quality’ or ‘circumstances’ of an event that is the cause rather than the event itself, for the event itself is the same as others in some respects and different in other respects. The suspicion that it is not events, but something more closely tied to the descriptions of events, that Hume holds to be caused, is fortified by Hume’s claim that causal statements are never necessary. For if events were causes, then a true description of some event would be ‘the cause of b’, and, given that such an event exists, it follows logically that the cause of b caused b.” (Davidson 1975: 82–83).

In science, we cultivate the explanation of events and elevate logical induction based on the criterion of repetition, but the whole process of induction contains only particulars rather than the unobservable universal. One could reduce it to an explanation of the occurrences (effects) of events, which amounts to a common-sense analysis. More than two centuries ago David Hume showed that premises about observed things cannot entail anything about the unobserved, so the justification here cannot rest on the deductive method. Hume primarily challenges the dubious nature of synthetic necessary connection, as we can conceive its contradiction. Hume believes that the contradiction of any law of nature is at least conceivable, since the truth of such a proposition expresses only an empirical matter of fact that has no linkage to a priori truth (truth independent of experience). The Humean theory of causation shrinks causation to law-like behaviour and consequently ties causation to determinism. Needless to say, Hume’s definition of causation lays down three interconnected conditions: succession, contiguity in space, and the invariable coincidence of the same event types. Concerning the problem in focus, this delimits the scope of the growth of knowledge, since an interpretation of the observed entity may relate to some previously unobserved inferences. For instance, when a person perceives a red apple, the person has never observed the taste of the apple, which may only be derived by inference. We can recast the problem at a different level, where it emerges as the justificatory question: ‘how could we derive an unobserved conclusion from observed premises?’.

Two primary claims will inform the justification of the problem whose contours are to be developed. The first claim nourishes the function of inductive reasoning in terms of the past successes of such inferences, while the second justifies the process of accepting a uniformity premise, since our experience illustrates a uniformity of the universe. Hume argues that here one has to use an additional premise that is itself unobserved; otherwise, the whole argument would lead towards a fallacy. Ayer writes, “And in any case, Hume is right in saying that we cannot have the best of both worlds; if we want our generalizations to have empirical content, they cannot be logically secure; if we make them logically secure, we rob them of their empirical content. The relations which hold between things, or events, or properties, cannot be both factual and logical.” (Ayer 1973: 219).
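Put schematically (a restatement of the argument just described, not a quotation from Hume), the inductive inference with its hidden uniformity premise looks like this:

$$\begin{array}{l} \text{P1: All observed } F\text{s have been } G.\\ \text{P2 (uniformity premise): unobserved cases resemble observed cases.}\\ \hline \therefore\ \text{The next, as yet unobserved, } F \text{ will be } G. \end{array}$$

P1 is observational, but P2 is neither a truth of logic nor itself observable, and this is exactly where Hume presses his objection.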

More interestingly, this unobserved premise could not be a priori in the Kantian sense. To make it more scientific, the additional premise might be an a posteriori proposition that relies on experience instead of reasoning. To be more precise, the principle of the uniformity of the universe hinges an unobserved conclusion in science on the observed premises, opening scope for inductive knowledge that carries a justificatory expectancy (since the uniformity premise is an empirical one) within our belief systems. Hume’s stance liberates the forum of empiricism regarding the unobserved premises to earn its justificatory position, and the justification of the unobserved premises is a matter of inference from the observed propositions. The philosophical reason for disdaining the theory lies in the uniformity premise, which invites skepticism about unobservable premises that cannot be shown, logically or scientifically, to resemble (by mere generalization) the observed premises. The principle of unwarranted assertibility emphasizes that past incidents bear scant resemblance to future incidents within the scheme of purported justification. However, Goodman argues, “To say that valid predictions are those based on past regularities, without being able to say which regularities, is thus quite pointless. Regularities are where you find them, and you can find them anywhere…” (Goodman 1972: 388).

When laws are universal and depend on generalization, elementary deductive logic exercises its discretion by inferring unobservable facts. But in the case of statistical laws, the module of logical consequence entails probabilistic statements. The trail of induction versus deduction synchronizes the sphere of science. Induction in the traditional way contrasts with deduction in so far as deduction goes from the general to the particular, while induction goes conversely, from the particular to the general. This, however, is a misleading oversimplification. We can detect various types of inference in deduction, as well as many kinds of inference that may be set up in induction. The conception of ‘inductive inference’ is not to be delimited to inference from facts to laws; rather it logically illustrates the ‘non-demonstrative’, where the conclusion does not follow with logical necessity even when the truth of the premises is granted. We are emphasizing the concept of probability, or degrees of probability, that one may to an extent call inductive probability.
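The contrast can be displayed with the thermal-expansion example used earlier (an illustrative restatement, not a quotation):

$$\begin{array}{ll} \text{Deduction:} & (x)(Px \supset Qx),\; Pa \;\vdash\; Qa\\ \text{Induction:} & Pa_1 \wedge Qa_1,\; \ldots,\; Pa_n \wedge Qa_n \;\Rightarrow\; (x)(Px \supset Qx) \end{array}$$

In the deductive case the conclusion cannot be false if the premises are true, whereas in the inductive case it can.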

In a nutshell, it is nonetheless promising to see what sorts of inductive probability there are in the form of inductive reasoning. Two types of probability, subjective and objective, contend within the field of inductive inference. ‘Subjective probability’ consists in the probable truth-value of the subject’s belief about a particular proposition, like expecting rain, whereas ‘objective probability’ measures the probability of the occurrence of an objective incident independently of the concerned subject’s mind, like the forecast of rain by the meteorological department. Bird clarifies, “However, contrasting with this is the indeterminacy of certain irreducibly probabilistic laws, such as the laws governing the decay of atomic nuclei. These are irreducible in the sense that there are no further facts or laws that altogether fix deterministically the decay of a particular nucleus (which would make nuclear decay like coin tossing). (Some physicists have believed that nuclear decay is coin tossing. Such a belief promoted Einstein’s famous remark that ‘God does not play dice with the universe’. According to Einstein’s view, there are hidden variables that do determine decay. It is our ignorance of these that makes decay look probabilistic. But most physicists accept that Einstein was wrong about this.) If there are irreducibly probabilistic laws then the chance that a nucleus will decay in a certain time will be an objective property of that nucleus. Irreducibility probabilistic laws put the case for an objective notion of probability in its strongest form.” (Bird 2003: 191–192).
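Bird’s example of an irreducibly probabilistic law can be made concrete with the standard law of radioactive decay (a textbook formula, supplied here only for illustration): for a nucleus with decay constant $\lambda$, the chance that it decays within time $t$ is

$$P(\text{decay within } t) = 1 - e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda},$$

an objective probability attaching to the individual nucleus rather than to anyone’s state of belief.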

Besides, the key feature of the deductive argument lies in the necessary relation between premises and conclusion, because here the conclusion is contained within the sphere of the premises. So the truth-value of the conclusion is deduced from the truth-value of the premises. Let us examine the issue from the standpoint of causality. What does it mean to allege that one event caused another, or that premises caused a conclusion? Causality is not an entity but a process, though it may look static. When we say ‘cigarettes cause cancer’, we mean that the habit of smoking cigarettes or taking tobacco is a single cause of the event of cancer. There may be some other causes or conditions of having the disease, but smoking cigarettes is the cause of its onset. Now the query is whether the causal relation is necessary or merely probable. It seems to be one of the relevant laws for the event, but it cannot turn out to be a necessary law. The causal relation in a precise sense stands for predictability, a type of potential predictability (not at all an actual predictability). Carnap suggests, “For this reason, when I use the term ‘predictability’ I mean it in a somewhat metaphorical sense. It does not imply the possibility of someone actually predicting the event, but rather a potential predictability. Given all the relevant facts and all the relevant laws of nature, it would have been possible to predict the event before it happened. This prediction is a logical consequence of the facts and laws.” (Carnap 1995: 192–193).

The problem is that one has to distinguish logical necessity from the entailment of objective necessity. An objective necessity comprises a causal, empirical alliance between objects and their properties connected by natural laws, while logical necessity may secure the analytic proposition, bringing in the idea of a priori truth that traces back to non-empirical systems. Scientific modification involves a shift in the standard beliefs of science and makes a trenchant transformation of world-view, methodology, and logical reasoning.

Karl Popper in Defence of Scientific Knowledge

In his celebrated work Introduction to Mathematical Philosophy (Russell 1919, reprinted 1950) Bertrand Russell inquires into the sense of grammar needed to define what this thing called number is. The impetus for his writing on this mathematical philosophy came from the short and splendid book of Gottlob Frege titled Grundlagen der Arithmetik (Frege, The Foundations of Arithmetic, 1960). The idea of number sounds pluralistic, and there is a common criterion that may be shared by the same objects, one that numerically characterizes a certain collection or set of such objects. For instance, the number 12 may be characterized by a dozen bananas. Russell (Russell 1950: 12–13) emphasizes two distinct ways to define the number of a set or collection.

  a. The definition by intension of a class helps us define the common and sharable properties of the class, and it could not theoretically be reduced to a definition by extension. For instance, mankind, or the people of India, etc.

  b. The definition by extension actually enumerates the class or the set, and it could be reduced to an intensional one. For instance, a trio = {x, y, z}.

These two ways lead us to study the definition of number from three different angles:

First, since the numbers themselves form an infinite collection, the process of enumeration (the definition by extension) cannot exhaust them.

Secondly, in some cases the collections having a given number of terms themselves form an infinite collection of sets. For instance, there are presumably an infinite number of dozens in the world, or else one has to believe that the stock of objects is finite.

Thirdly, we theoretically prefer a definition of ‘number’ that allows for infinite numbers; this requires us to speak of the common properties of numbers in terms of the definition by intension.

In a nutshell, Russell claims that the number of a class turns on the similarity of that class to others, and he says, “One class is said to be “similar” to another when there is a one–one relation of which the one class is the domain, while the other is the converse domain.” (Russell 1950: 16) On the method of defining a number as the successor of the previous number (the number 2 is the successor of the number 1, and so on), the process of taking successors would be repeated a finite number of times, as Russell argued. For Russell, a property in the theory of natural numbers turns on heredity, a module of succession, i.e. if the property belongs to a particular number P, the same property also belongs to the successor of P, for instance P + 1. This method hinges on the induction method, and Russell wonderfully argues, “A property is said to be ‘inductive’ when it is a hereditary property which belongs to 0. Similarly, a class is ‘inductive’ when it is a hereditary class of which 0 is a member” (Russell 1950: 21–22). Here the conceptions of heredity and succession trace back to the evidential support of the previous number, so the inductive logical anticipation seems necessary and mathematically justifiable.
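In modern notation, the definitions Russell is quoted as giving can be restated as follows (a paraphrase, not Russell’s own symbolism):

$$\text{Hereditary}(P) \;\equiv\; \forall n\,\big(P(n) \rightarrow P(n+1)\big)$$

$$\text{Inductive}(P) \;\equiv\; P(0) \wedge \text{Hereditary}(P)$$

$$\text{Mathematical induction: } \text{Inductive}(P) \rightarrow \forall n\, P(n)$$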

Let us recall a remarkable passage from the great physicist Max Planck—“We have no right to assume that any physical laws exist, or if they have existed up until now, that they will continue to exist in a similar manner in the future.” Karl Popper has enormously disparaged this formula in his two well-known works The Logic of Scientific Discovery (Popper 1959) and Objective Knowledge: An Evolutionary Approach (Popper 1972). According to Popper, the traditional philosophical problem of induction draws a link between induction and the expectation that the future will resemble the past, and it also assumes that the logic of the scientific view is analogous to inductive logic. For the philosophers who hail the inductive method reckon that the true edge of logical analysis is nothing but the inductive logical form, in which one reaches a universal statement by passing through singular observational statements. In his book Objective Knowledge, Popper dubbed the commonsense view behind the problem of induction the ‘bucket theory of the mind’, since common sense assists in making us believe in the regularities (repetitions) of nature, turning observed events towards expectancy.

Hume nurtured two separate problems of induction regarding human knowledge: the logical and the psychological problems of induction. The logical problem of induction concerns reasoning from the observational level to an unobserved conclusion on the basis of the coherence and repetition of incidents or events over time, whereas the psychological problem of induction deals with the belief system of the agent: we have some experiences, and these experiences instigate us to expect and believe (a custom or habit, in Hume’s words) in a particular forthcoming unobserved conclusion that would follow from the previously observed premises. Hume rebuffs these principles as a way of securing objective science and experience-based knowledge, as I discussed earlier. Popper disputes the approach of Humean logic but appreciates the implicit distinction between the logical and the psychological derivation of inductive reasoning. Popper restates the psychological (subjective) terms within the sphere of objectivity in his amendment of the logical problem. This helps us dispose of subjective psychological beliefs by recasting them as objective explanatory statements. Popper writes, “Once the logical problem, HL, is solved, the solution is transferred to the psychological problem, HPS, on the basis of the following principle of transference: what is true in logic is true in psychology. (An analogous principle holds by and large for what is usually called ‘scientific method’ and also for the history of the science: what is true in logic is true in the scientific method and the history of science.)” (Popper 1972: 6) For Popper, this synthesis rids us of Hume’s irrationalism regarding the method of induction. Even following the part of Hume’s theory he appreciates (that induction cannot be reduced to repetition in logic), Popper pivots on the principle of transference, which renders any kind of induction by repetition obsolete, not only in the scheme of logical approaches but also in psychology and the other natural sciences. Popper and Hume are on the same page when the problem concerns the justification, by empirical reasoning, of an explanatory universal truth that goes beyond the observed statements. They disdain this particular theory because ‘no number of true test statements would justify the claim that an explanatory universal theory is true.’ Although an inconsistency begins to emerge between them, Popper qualifies his theory and argues, “Yes, the assumption of the truth of the test statements allows us to justify the claim that an explanatory universal theory is false.” (Popper 1972: 7) The problem (of logical induction) becomes severe because the truth-value (truth or falsity) of the universal law rests on the given test statements, even though the truth of test statements justified through ‘empirical reasoning’ can only show an explanatory universal theory to be false. The panorama Popper anchors in his writing calls for a novel conjecture about deduction in science and resists even a ‘whiff of induction’, since the pursuits of justification that deal with induction are nothing but hypotheses that should be eliminated. A few of his many arguments shed new light on the approach of formulating the probability hypothesis in science and philosophy.
I appreciate the point when Popper, in his evolutionary approach, argues, “The realization that all knowledge is hypothetical leads to the rejection of the ‘principle of the sufficient reason’ in the form ‘that a reason can be given for every truth’ (Leibniz) or in the stronger form which we find in Berkeley and Hume who both suggest that it is a sufficient reason for unbelief if we ‘see no [sufficient] reason for believing.’” (Popper 1972: 30) The principle of sufficient reason comes together with the causal principle, viz. sufficient cause.

In Popper’s view, scientific conjectures do not begin with the method of observation, nor do scientists infer a general theory anchored in a finite number of past observations. Rather, Popper believes that scientific conjecture first sets off a theory that is as yet uncorroborated, and scientists then evaluate whether its predictions turn out true or false against evidential premises and a variety of testable consequences. If the conclusion turns out to be experimentally falsified, the scientists seek a new alternative. Donald Gillies writes, “Science does not start with observations, as the inductivist claims, but with conjectures. The scientist then tries to refute these conjectures by criticism and testing (experiments and observations). A conjecture which has withstood a number of severe tests may be tentatively accepted, but only tentatively. We can never know a scientific theory, law, or generalization with certainty. It may break down on the very next test or observation (as in the case of the discovery of black swans in Australia).” (Gillies 1993: 29) The place of induction in science is delimited in the Popperian hypothesis, while the inferences that are of value to science are actually refutation-oriented: they take failed predictions as premises and conclude that the theory behind these premises is falsified too. The whole process appreciates only deductive reasoning, which collides with the inductive one. Science, according to Popper, is a succession of conjectures and refutations, in which a refuted theory is replaced by a newer one. The concept of falsification is a defining criterion that assists science in demarcating itself from the ‘pseudo-sciences’.

Why Objective Knowledge Matters

The structural pattern of the world is synchronized in a tripartite model of the conscious, the unconscious, and Plato’s conception of a sui generis third world, a world made intelligible through Forms or Ideas. Popper develops an ontological distinction and brings back the problem of three worlds, which eventually encompass the physical world (physical states), the mental world (mental states), and the world of the intelligible (ideas in the objective sense). The second world (the mental world) mediates between the other two. How could we establish the objectivity of the third world, or is it a subjective element of our content of thought?

The problem with Plato’s version of the third world is that it sounds one-sided, since he emphasizes universal notions and never takes mathematical propositions, like ‘5 times 10 equals 50’, seriously. The Stoic formulation goes further in so far as it entails a linguistic (language-centric) analysis of the third world. Popper claims, “Theories or propositions, or statements are the most important third-world linguistic entities…It was the Stoics who first made the important distinction between the (third-world) objective logical content of what we are saying and the objects about which we are speaking. These objects, in their turn, can belong to any of the three worlds: we can speak first about the physical world (either about physical things or physical states) or secondly about our subjective mental states (including our grasp of a theory) or thirdly about the contents of some theories, such as some arithmetical propositions, and say their truth or falsity.” (Popper 1972: 157–158).

The realist group pioneered by Plato posits the eternal verities of an autonomous third world, independent of the subject’s existence. The empiricist group headed by Locke, on the other hand, initially grants that the eternal verities of Platonic Ideas are not human constructions, but considers them twaddle, set apart from the real, man-made things made for human use. Popper intertwines the two antagonistic groups in his philosophical conjecturalism and refutes the rigid sense of both the realist and the idealist accounts. Popper says, “I suggest that it is possible to accept the reality or (as it may be called) the autonomy of the third world, and at the same time to admit that the third world originates as a product of human activity…According to the position which I am adopting here, the third world (part of which is human language) is the product of men, just as honey is the product of bees, or spiders’ webs of spiders. Like language (and like honey) human language, and thus larger parts of the third world are the unplanned product of human actions…” (Popper 1972: 159–160). Numbers, for Popper, are an excellent example of man-made products of human language in the Lockean sense, although problems about prime numbers seem autonomous, so they doubtlessly fall under the realm of the third world in the Platonic sense.

The growth of knowledge is indeed underpinned by language across these three worlds. My understanding suggests that we may take this as a satisfactory objective ground for exhibiting the justification of propositions of cognitive significance in terms of scientific statements. The growth of scientific knowledge fosters experience or observation in two diverse directions:

  a. Being allied to objective properties that have a causal disposition.

  b. The attitude of universal law, which transcends experience.

To draw a distinction between observational and theoretical terms by relying on the scientific convention of logical terms sounds spooky. One could abstain from the circular reasoning behind the jargon that all universal terms stand for universal laws by entailing observational phrases: there might be a universal term that does not fit any universal law but has the sophisticated propensity to be reduced to an observational term. Promoting the rejection of the division between theoretical and observational terms could consequently prop up the falsifiability of both accounts, since a counter-instance of either is conceivable. So the conjecture of scientific knowledge would be falsifiable in accord with the falsifiability of the accounts of theoretical or observational terms.

In a broad sense, the growth of objective knowledge in a rudimentary way revives sense experience and weakens the scope of a metaphysical hold that speculates about a world consisting of entities independent of experience. We cannot adopt the transcendental outlook of ontology in these scientific conjectures to meet the skeptical stand against objective knowledge. Popper’s contention is that observation can only counter theories; it does not support them.

W. V. Quine, in his splendid work Pursuit of Truth (Quine 1992), initially agreed with Popper: from the section entitled ‘Observation Categorical’ almost to the end of the section entitled ‘Test and refutation’, one can find a strong agreement between Quine and Popper. But in the last part of ‘Test and refutation’, Quine says something that Popper would disagree with:

It is clearly true, moreover, that one continually reasons not only in refutation of hypotheses but in support of them. This, however, is a matter of arguing logically or probabilistically from other beliefs already held. It is where the technique of probability and mathematical statistics is brought to bear. Some of those supporting beliefs may be observational, but they contribute only in company with others that are theoretical. Pure observation lends only negative evidence, by refuting an observation categorical that a proposed theory implies. (Quine 1992: 13).

As I understand this, Quine’s considered position is that Popper is right in arguing that observations by themselves cannot support a hypothesis. But the claim that they can support it in conjunction with theories we already accept seems contrary to empiricism as the theory has traditionally been understood. We accept hypotheses not just because they are supported by observations but because we have, to some extent arbitrarily, decided to add them to ‘the backlog of theory’. As Putnam points out, “A common objection to arguments from indispensability for physics to realism with respect to mathematics is, of course, that we do not yet have, and may indeed never have, the ‘true’ physical theory; my response is that, at least when it comes to the theories that scientists regard as most fundamental (today that would certainly include quantum field theories), we should regard all the rival theories as candidates for truth or approximate truth, and that any philosophy of mathematics that would be inconsistent with so regarding them should be rejected.” (Putnam 2012: 223) Moreover, the Quinean view insists that this is where the method of probability and mathematical statistics is brought to bear; hence there is such a thing as probabilistic support, in retrospect of the development of mathematical statistics. The striking principle of satisfactory explanation in science is meant to be a causal explanation in which the explicans and the explicandum have a causal and testable interface. The growth of scientific knowledge thus retains not only the logical emblem but also the theory of empirical method (experience-based).

Spinning Out

The deductive method is the only logical technique that could be an elementary part of scientific conjectures. In this deductive process, according to Popper, the concluding proposition is entailed by the premises. Here the conclusion is confronted with the other pertinent statements to resolve whether they falsify it, in the sense of corroborating the hypothesis by eliminating errors. In his work The Logic of Scientific Discovery, Karl Popper (Popper 1995: 32–33) envisages four synchronized steps in defending deduction; a schematic sketch of the resulting testing cycle follows the list.

  a. The first step emphasizes a formal account, where one tests the internal consistency of the theoretical structure to determine whether it harbours any contradiction.

  b. The second step is semi-formal: we underline two diverse elements together, the empirical and the logical. Scientific conjectures take explicit form by prioritizing the logical element so as to avoid asking the wrong questions. Most scientific theories contain analytic (i.e. a priori) and synthetic elements, and it looks inevitable to axiomatize these in order to distinguish the empirical from the logical.

  c. The third step promotes a comparison of the new theory with the available theories to expose a constructive advancement over them. This is essential, since the process of adopting a new theory relies on its constituting an advancement over the older one; conversely, if the new theory fails to resolve the previously unsolvable problems of the existing theory, without providing enhanced empirical content and shreds of evidence, there is no question of adopting the new theory and its successors. This is the primordial cum theoretical process that expands the progress of science. At this point, Popper claims that the comparison of the existing and newer theories and of their advancement relies on the deductive testing (greater empirical content plus predictive power) of both theories in preference to an inductive one, as in physics, where Einstein’s Theory of Relativity replaced Newton’s Theory of Universal Gravitation.

  d. The fourth step pivots on the testing of a theory by the experiential application of its derived conclusions. Here, showing the truth of the conclusions yields only unverified corroboration of the theory, while the falsity of a conclusion signals that the theory is logically falsified. The quest of science is to find a better theory; more precisely, the method of theory-testing underscores singular propositions deduced from the new theory, risky predictions that are nonetheless experimentally testable. Popper also argues that the corroboration of new theories relies on a new prediction that falsifies the older one when a compelling hypothesis is adopted; conversely, the falsification of a theory derives from its false prediction. Popper intends to celebrate empiricism, since the growth of the scientific method is entangled with the method of experience, but he does not believe that experience actually determines the theory; rather, he appreciates the delimitation of any scientific conjecture that turns out false.
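The four steps can be caricatured as a testing loop. The sketch below is only an illustrative schematic of the conjecture-and-refutation cycle, not Popper’s own formalism; all names and the toy data are invented for the example.

```python
from typing import Callable, Iterable

def test_theory(derive_predictions: Callable[[], Iterable[float]],
                observe: Callable[[int], float],
                tolerance: float = 0.05) -> str:
    """Confront each deduced prediction with observation (step d).

    One failed prediction falsifies the theory; surviving every test
    yields only tentative corroboration, never verification.
    """
    for i, predicted in enumerate(derive_predictions()):
        if abs(observe(i) - predicted) > tolerance:
            return "falsified: seek a better theory"
    return "corroborated (tentatively): it may still fail the next test"

if __name__ == "__main__":
    # A toy 'theory' predicting 9.8 for every trial, confronted with
    # invented observations; the data are illustrative only.
    predictions = lambda: [9.8, 9.8, 9.8]
    observations = [9.81, 9.79, 9.80]
    print(test_theory(predictions, lambda i: observations[i]))
```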

In consequence, Popper writes, “For scientific laws, too, cannot be logically reduced to the elementary statements of experience. If consistently applied, Wittgenstein’s criterion of meaningfulness rejects as meaningless those natural laws the search for which as Einstein says, is ‘the supreme task of the physicist’: they can never be accepted as genuine or legitimate statements. Wittgenstein’s attempt to unmask the problem of induction as an empty pseudo-problem was formulated by Schlick in the following words: “The problem of induction consists in asking for a logical justification of the universal statements about reality…We recognize, with Hume, that there is no such logical justification: there can be none, simply because they are not genuine statements.” (Popper 1995: 36–37).

At his Herbert Spencer lecture at Oxford University in 1914, Russell hopes, “The failure of philosophy hitherto has been due in the main to haste and ambition: patience and modesty, here as in other sciences, will open the road to solid and durable progress.” (Russell 1989: 120). The question of the justification of scientific theory thrusts us into experience. The issue turns on the manner in which a slight difference between observational and theoretical sentences is drawn. Scientific conjectures have two different tools, analysis and falsifiability, which cope with observational sentences so as to foresee a deductive method drawn from theories. As we know, the classic inductive account frames the confirmation of a theory in terms of certain auxiliary conditions, like A1, B1, and C1, turning out to be true; in contrast, the falsifiability hypothesis cherishes the deductive method, where the falsification of any auxiliary condition may render the whole theory false. The refutation method of scientific conjectures infers the falsity of the premises from the falsity of a sentence deduced from them. What looks promising to Popper is the genuine refutation of scientific theories, manifested by counter-evidence, as against their confirmation. Any true scientific theory could fall into the domain of falsifiability. An enterprise of scientific theory may channel the methods of proof and disproof of the universal generalization of scientific statements. Let us take an example:

  • ‘All planets have elliptical orbits’.

The proposition is a universal statement that all planets must have elliptical orbits. But the concern seems opaque, as the subject term of the statement, i.e. ‘all planets’, is yet to be surveyed. We do not have proper evidence or observation regarding the total number of planets. Here the subject term resists proof, while the predicate (elliptical orbits) hinges on a certainty, or proof, about the figure of the orbits, owing to the number of planets observed to date that do have elliptical orbits. We can predict, in a probabilistic sense, that the next planets would have an elliptical orbit, and Popper has argumentatively challenged this type of inductive reasoning in his theory. Popper thinks that the conventional account of inductive reasoning would be refuted only if counter-evidence were available, that is, if we could find a planet that does not have an elliptical orbit. It is truly hard to prove positively the confirmation of a scientific theory; at most, scientific theories are well-corroborated, that is, highly testable and such that they have survived severe testing. Let me clarify here what is meant by the testability and the severe testing of a scientific theory. The testability of a scientific theory lies in its yielding testable predictions commensurate with its degree of empirical content. On this account, there is a good possibility of finding an exception, and the potential for the theory to stand as more corroborated (highly tested and yet to be refuted) while still possibly turning toward falsity. Besides, the severe testing of a theory relies on predictions that look highly unfeasible given our accessible knowledge, like Einstein’s prediction about the bending of light. Popper believes in a degree of corroboration that is connected to the theory’s successors. Now a strong sense of entailment (covertly inductive) governs the theory, assisting us in seeing, through backward processing, a simplification of its past performances. But we know that Popper’s vital edict intends to banish inductivism from the sphere of science. One may ask: ‘Could there be induction free of severe testing in science?’ The growth of knowledge in Popper’s writings pivots on a development from the falsified theory to a false theory that is yet to be falsified. The falsifiability hypothesis has an amiable inclination toward the future falsification of a universal theory that presently survives severe testing. Even Goodman’s new riddle rests its credence on the future unpredictability of the presently regulated inductive concepts. Hilary Putnam considers, “Popper does not deny that scientists state general laws, nor that they test these general laws against observational data. What he says is that when a scientist ‘corroborates’ a general law, that scientist does not thereby assert that law to be true or even probable. ‘I have corroborated this law to a high degree’ only means ‘I have subjected this law to severe tests and it has withstood them’. Scientific laws are falsifiable, not verifiable.” (Putnam 1984: 354).
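The logical asymmetry at work in this example can be stated compactly (a standard schema, added here for clarity): where T is the theory and O one of its observational consequences,

$$\frac{T \rightarrow O \qquad \neg O}{\therefore\; \neg T} \quad \text{(valid: modus tollens)} \qquad\qquad \frac{T \rightarrow O \qquad O}{\therefore\; T} \quad \text{(invalid: affirming the consequent)}$$

so a single non-elliptical orbit would refute the universal statement, while no number of observed elliptical orbits verifies it.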

The question of anticipating future success is one track toward accepting the inductive method in science, but Popper refutes the method and strikes against inductive inference by bringing in the idea of the ‘provisional conjecture’, where scientific laws are simply yet to be falsified. To make science free of this inductivism, we have to understand what it is to ‘corroborate’ a scientific theory. One should contrast this account with the probabilistic way of validating the verification theory. Popper’s schemata pin down a sophisticated account of falsifiability, where the falsification of a prediction does not merely falsify the whole theory but prompts the choice of a more highly corroborated theory. For Popper, the falsifiability hypothesis is a criterion not for theories (logical and probabilistic dissent) alone, but for the methods of evaluating conjectures. The concept of corroboration entails a background-focused picture, in which we look back at the history of the theory or its derivation; these simply present reports of the past attempts to falsify the theory, or of the evidential/non-evidential (inductive) performance of the method. Here the predicting theory could be placed as a surviving theory that structures a background condition of improbability relating to the prediction. Putnam worried about the puzzlement here and claims, “A theory which implies an improbable prediction is improbable, that is true, but it may be the most probable of all theories which imply that prediction. If so, and the prediction turns out true, then Bayes’s theorem itself explains why the theory receives a high probability. Popper says that we select the most improbable of the surviving theories—i.e. the accepted theory is most improbable even after the prediction has turned out true; but, of course, this depends on using ‘probable’ in a way no other philosopher of science would accept.” (Putnam 1984: 357–358) Bayesians maintain that subjective degrees of belief are legitimate so long as they conform to the axioms of the probability calculus. We may attach different subjective probabilities to the same factor, as in a statistical review of a particular case (predicting intensified Covid-19 virus eruptions in India). An increase in the conditional probability and the revision of one’s degrees of belief depend on the evidential support.
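For reference, the theorem Putnam invokes has the standard form

$$P(T \mid E) = \frac{P(E \mid T)\,P(T)}{P(E)},$$

so if a theory T entails an antecedently improbable prediction E (small P(E)) and E turns out true, the posterior probability P(T | E) is boosted accordingly, which is the point of Putnam’s remark about improbable predictions.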

The general laws of science, or scientific theorems, can be falsified if basic sentences deduced from the theories set out a wrong prediction. The same consideration may be pertinent in the case of inductive logic, where the prediction of the theory needs to issue in a true conclusion. Let us take a paradigm case of science, the Law of Universal Gravitation, which is not easily traced within the domain of Popper’s falsifiability hypothesis. In over two hundred and fifty years, scientists have yet to derive a prediction from the Law of Universal Gravitation alone that would make the theory falsifiable. So the Popperian account cannot provide a correct analysis either of this scientific methodology or of its practical application. Putnam raises a severe objection against Popper by defending the primacy of practice in science, since any scientific idea channels its practical application in science, technology, and human life. Even in our practice, we can find out the correctness or the failure of an idea by seeing its successful long-run practice or its unsuccessful and insignificant application in our daily life. The whole process of knowing the significant or insignificant practices of theories or ideas in the practical life of human beings can be understood only on the basis of experience. Putnam argues, “In this sense ‘induction is circular’. But of course it is! Induction has no deductive justification; induction is not deduction. Circular justifications need not be totally self-protecting nor need they be totally uninformative: the past success of ‘induction’ increases our confidence in it, and its past failure tempers that confidence. The fact that a justification is circular only means that that justification has no power to serve as a reason, unless the person to whom it is given as a reason already has some propensity to accept the conclusion. We do have a propensity—an a priori propensity, if you like—to reason ‘inductively’, and the past success of ‘induction’ increases that propensity. The method of testing ideas in practice and relying on the ones that prove successful (for that is what ‘induction’ is) is not unjustified. That is an empirical statement.” (Putnam 1984: 375).

Logic, intuition, and experience, which come together in science, generate a great methodological concern in science. Darwin’s development in science does not debunk clear-headed scientific innovation, while his methodological milieu sophisticates a line-up in which the secrets of nature are revealed through observation, colligation, doggedness (the hypothesis of the inherited diffusion of acquired characters), and attention to exceptional findings. A similar anti-inductive stance has been well-appreciated by Einstein, who writes, ‘There is no inductive method which could lead to the fundamental concepts of physics’ (Einstein 1950: 76). He never slurs over the importance of deduction and intuition in the growth of exact science. In his path-breaking work Relativity, Einstein argues, “Guided by empirical data, the investigator rather develops a system of thought which, in general, is built up logically from a small number of fundamental assumptions, the so-called axioms. We call such a system of thought a theory. The theory finds the justification for its existence in the fact that it correlates a large number of single observations, and it is just here that the ‘truth’ of the theory lies.” (Einstein 1962: 123–124).

Actually, a trenchant philosophical comprehension of science sets out from an epistemological pledge to empiricism, an empiricist understanding of theory that looks quite apposite to the reasoning of Popper’s hypothetico-deductivist trend. Post-Popperian accounts of doing science put forward different models, like Kuhn’s paradigm shift (Kuhn 1962) and Lakatos’s scientific research programmes (Lakatos 1970). These are undoubtedly methodological apprehensions that undermine the theory-centric view of science. Popper’s account seemingly enterprises a nihilistic plea in which scientific research instigates falsifiable conjectures that cannot apprehend any positive ventures of vigorous scientific theories. Popper aims to meet this objection by arguing that the foundation of the logic of scientific discovery relies on falsifiable conjectures. Reasoned judgment cannot envisage the future by a whiff of ungrounded concomitance. The growth of genuine scientific knowledge does not advocate nurturing anything immune from criticism in support of the theory. A reasonable willingness to recognize only theories that have the ‘potential to be refuted by observational evidence’ makes for the growth of scientific knowledge. But following Kuhn one might claim that an objective observational decision could not be a yardstick for scientific theories. This lights on the contention that there are incommensurable aspects tied to presuppositions and social settings at the stage of ‘puzzle-solving activity’. Kuhn writes, “normal science ultimately leads only to the recognition of anomalies and crises. And these are terminated, not by deliberation and interpretation but by a relatively sudden and unstructured event like the Gestalt switch” (Kuhn 1962: 104). He considers that neither comparisons nor empirical evidence alone can show a paradigm to be flawed. The process of paradigm replacement seems a three-layered procedure that involves an established paradigm, a rival paradigm, and observational evidence.

The underdetermination of theory by observational data (UTD), an account long advocated by instrumentalists and associated with Duhem and Quine as the ‘Duhem-Quine thesis’, exposes that a scientific theory does not by itself yield predictions; rather, it is the conjunction of the theory with auxiliary hypotheses that may lift a prediction in general. The striking point is that the falsification of the prediction by observable evidence cannot by itself disprove the scientific theory. The scientific theory may remain retained, while the conjunction of the theory and its conjoining auxiliaries is what turns out to be flawed. Quine envisages, “Any statement can be held to be true come what may, if we make drastic enough adjustments elsewhere in the system. Even a statement very close to the periphery can be held true in the face of recalcitrant experience by pleading hallucination or by amending certain statements of the kind called logical laws. Conversely, by the same token, no statement is immune to revision.” (Quine 1961: 43).

The growth of scientific knowledge falls into two diverse but entwined dimensions: in the context of discovery, scientific conjectures rely on intuition and the deductive method, while the context of corroboration seems experience- or test-centric. We have to elicit the logical construction of the world and justify science by grounding it in sense-data (actual and possible) that are autonomously testable. Putnam spells out, “Deflationism about reference typically goes with deflationism about truth. If saying that representation is a relation between organisms (and states of organisms and, derivatively, bits of language) and real things, properties, and events is ‘representationalism,’ then representationalism is no sin!” (Putnam 2016: 40) A celebrated dictum (realism with a human face) casts a retrospective glance at Wittgenstein’s representationalism; Wittgenstein once says, ‘Even the hugest telescope has to have an eye-piece no larger than the human eye.’

Synchronizing the logical aspect of reality with a representation of reality in human language, I hitherto find it appealing to philosophize about science from structural cum methodological ends. An evasion into philosophy-free science makes the process of scientific knowledge a stupor; to find coherence between observation and reason is the true manifestation of the scientific pursuit. The progression of science tends to highlight different schemata: logical, cognitive, and social. The cognitive schemata give an account of a set of mental representations interlinked with mental procedures. Acquired beliefs in science are an adopted upshot of these mental representations and of the mechanism of the cognitive procedures that uphold social interests and power connections. This integrated cognitive-social schema amends the scope of logical methods in scientific practice, since the array of logical schemata looks more trivial than the psycho-social prospect. Any plausible scientific evolution in experimental science tries to map the analogical thinking of the psycho-social account, like pragmatic utility, human preference, requirement, prerequisite, and the appraisal of cognitive and social stratagems. Maxwell once says, ‘our task, in engaging in rational inquiry is to see, participate in, and help to grow what is significant and of value in existence in the cosmos.’ (Maxwell 1984: 8) The descriptive and considerate explanatory parts of scientific conjectures do not set apart the rudimentary constituent of human values in their implementation. A desirable scientific theory promotes its observational testing, fecundity, good track-record, internal consistency, clarity, compatibility, novel predictions, and quantitative predictions, etc. But some philosophers, like Paul Feyerabend (Feyerabend 1993), rebut method, and consider that no general method of scientific judgment is exceptionless. The method may look correct, but its implication for scientific knowledge turns out to be negative. One could amend the stance of an upward, cumulative view of science cogently, since we cannot sensibly endorse any scientific knowledge as establishing or rejecting a theorem. The accumulation of scientific knowledge cannot uphold an a priori method but appreciates a testable, a posteriori method. The question of reliability hinges on a formation in which testing, observation, and further development are effectively accessible. We have to judge each case separately, as science does not appreciate any tight and rigid particular method or uniform enterprise. Jerry Fodor explores, “Every science implies a taxonomy of the events in its universe of discourse. In particular, every science employs a descriptive vocabulary of theoretical and observation predicates, such that events fall under the laws of the science by virtue of satisfying those predicates.” (Fodor 1999: 432).

The structure of science contains many epistemological purviews, like the hypothesis of ‘projectibility’, a theory-laden methodological consideration that safeguards the assumption that the background theories upon which these conjectures rely are approximately true of unobservable (as well as observable) entities. A similar point that I would like to advance here is, in principle, a holistic approach to scientific conjectures that goes beyond the restricted principle of experience, or a testable hypothesis, to find out the validity of a chain of logical reasoning (deductive or inductive) in scientific statements. I think the mould of scientific growth integrates a new dimension that brings together background enabling conditions (sometimes inductive and sometimes deductive reasoning), objective referents, causal explanatory efficacy, experimental methods as hypothetical tools, and the loom of falsifiability, along with an efficiency towards the new conjecture of progressing from problem to problem. Reality is beyond the qualms of objectivity or conceptual schemata; actually, here, the observed things reveal themselves in the preferred methods, which sound indispensable for uncertainty and falsifiability.