Take the following expression:
a AND (b OR c)
In this expression, a must be true for the expression as a whole to be true.
Now take this expression:
(a AND b) OR (c AND ((a AND d) OR (a AND e)))
In this expression, too, a must be true for the expression as a whole to be true.
I have searched far and wide and, as far as I can tell, there is no name for such a term in logic. Am I missing something obvious? It’s akin to a dominator node in graph theory, the concept of which shows up all the time, but…as far as I can tell, this doesn’t come up enough in logic to have its own name.
Am I missing something obvious? I’m flabbergasted.
EDIT: To be clear, my surprise isn’t that such terms are obvious, or that other people haven’t noticed these things (they have): my surprise is that there apparently isn’t well-accepted nomenclature for such things.
I think the term is “necessary condition”. I haven’t done the calculation, but I suspect the latter is equivalent to
a AND (b OR (c AND (d OR e)))
And, since you don’t care about the overall result of the expression if a is false, you can consider
a IMPLIES (the rest)
At which point a is a necessary condition
(Edited, and I haven’t checked my sources, sorry)
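For what it’s worth, the suspected equivalence is easy to check mechanically; a minimal sketch in Python, enumerating all 32 assignments:

from itertools import product

# Brute-force check that the second expression and the factored form
# agree on every assignment of the five variables.
for a, b, c, d, e in product([False, True], repeat=5):
    original = (a and b) or (c and ((a and d) or (a and e)))
    factored = a and (b or (c and (d or e)))
    assert original == factored

print("equivalent on all 32 assignments")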
Yes
No, because “not a” satisfies “a implies *”
You could factor a out though, “a and (b or (c and (d or e)))”
Saying “a IMPLIES (the rest)” is saying that a is sufficient for the rest to be true. It doesn’t mean that a is a necessary condition.
Ah, yes — I did have that backwards.
Let P = (a AND b) OR (c AND ((a AND d) OR (a AND e)))
It can be derived that P → a, and that a is a necessary condition of P.
a → P would indeed say that a is a sufficient condition for P, and that is definitely not the case.
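The P → a claim itself can be verified the same way; a quick sketch (pure Python, names purely illustrative):

from itertools import product

def P(a, b, c, d, e):
    return (a and b) or (c and ((a and d) or (a and e)))

# P -> a is valid iff no assignment makes P true while a is false.
print(all(a or not P(a, b, c, d, e)
          for a, b, c, d, e in product([False, True], repeat=5)))  # True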
That would be true if a is false.

You mean, like in a IMPLIES (a AND (b OR c))? If so, this is not true in the case of a = true, b = c = false, in which case the expression simplifies to true IMPLIES false, which is false.

For both of these expressions, it is true that EXPR implies A (written as EXPR -> A), meaning that if EXPR is true, then for sure A must also be true. Several other posters did mention “necessary conditions”, and yes, this is the correct term for this. Otherwise, you could simply say that A is implied by EXPR.

Also, a nitpick: it doesn’t seem that predicates factor into this, so this looks like a question about propositional logic.
When we were converting a large-scale, procedure-based SQL program into a single SELECT statement where the conditions represented lambda expressions, I remember calling them “hinge terms”, because each one needed to be factored out as the first term of its sub-expression, and thus the rest of the expression hinged on it. But I can’t find a reference to it.
That simplifies to a ⋀ (b ⋁ (c ⋀ (d ⋁ e))), right? I’d just think of it as “a variable that needs to be true”. Maybe a fixed variable, since we know it must be a specific value?

Right. My surprise is simply that there isn’t already a name for such a thing, apparently.
Maybe my work has become too focused, but finding such terms in large and complicated expressions is an important part of my job, and I’ve never had a good way to refer to them. I call them “alpha terms” or “alpha predicates” by analogy with the similar objects from Rete networks, but that’s just my own personal name for them.
What value would we get by coming up with a term for this? I ask because usually mathematical concepts are isolated when they prove useful for reasoning or for the creation of other constructs.
Say for example you have a whole lot of predicates you need to test, thousands or millions, and they are of arbitrary complexity, often with terms repeated throughout.
If there are these kinds of terms, you could pull them out and test them all at once. Any term that failed would indicate that the associated expression(s) could be ignored entirely and removed from consideration.
In other words, such terms can be used to fast-fail tests involving huge numbers of terms and predicates.
I’d like to know if there’s a word for them so that the above explanation can be given more concisely.
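To make that concrete, here is a minimal sketch of the fast-fail idea in Python; filter_predicates, required_vars, and the rest are hypothetical names, not an existing API:

def filter_predicates(predicates, required_vars, assignment):
    # predicates:    dict name -> callable(assignment) -> bool
    # required_vars: dict name -> set of variables that must be true
    #                for that predicate to possibly be true
    # assignment:    dict variable -> bool
    survivors = {}
    for name, pred in predicates.items():
        # Fast-fail: if any required variable is false, the whole
        # predicate is false and never needs to be evaluated.
        if all(assignment.get(v, False) for v in required_vars[name]):
            survivors[name] = pred
    return survivors

# Example: the second expression from the post has required set {'a'}.
preds = {"P": lambda env: (env["a"] and env["b"]) or
                          (env["c"] and ((env["a"] and env["d"]) or
                                         (env["a"] and env["e"])))}
required = {"P": {"a"}}
env = dict(a=False, b=True, c=True, d=True, e=True)
print(filter_predicates(preds, required, env))  # {} -- P pruned unevaluated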
Could the already existing “factorization” be such a term? After all, factoring the “A AND” out is possible for all terms!
The reason that makes me choose the word “factorization” is that I feel the overall complexity of encoding the logical expression is smaller once you get the A out of all the other expressions. I should measure that, though.
But it is the same overall feeling that I get from factoring polynomials: a reduction of complexity/redundancy in the equivalent resulting expression. I understand (x+1)(x-2), while most of the time I don’t understand the expanded form of the expression as clearly.
Indeed, I would probably reach for “factorization” or “simplified form” myself.
I’d just refer to a as a consequence of the larger expression.

I just did a search, and apparently in the context of boolean satisfiability such variables might be called the “backbone” of the problem (https://ptolemy.berkeley.edu/projects/embedded/eecsx44/fall2011/lectures/SATSolving.pdf), though it’s not clear to me that it’s a super-common use.
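The backbone (restricted here to variables forced true, which is the case being discussed) can be computed by brute force for small formulas; a sketch, nothing like what real SAT solvers actually do:

from itertools import product

def true_backbone(variables, formula):
    # Variables that are true in every satisfying assignment.
    models = [dict(zip(variables, bits))
              for bits in product([False, True], repeat=len(variables))
              if formula(dict(zip(variables, bits)))]
    return {v for v in variables if models and all(m[v] for m in models)}

P = lambda s: (s["a"] and s["b"]) or \
    (s["c"] and ((s["a"] and s["d"]) or (s["a"] and s["e"])))
print(true_backbone(["a", "b", "c", "d", "e"], P))  # {'a'}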
Is this the absorbing element?
The opposite: “false” is the absorbing element for boolean AND, but here we’re talking about “true”, which is the identity element for boolean AND.
Ah, thanks.
A formal definition for such a term would be something like:
“Let X be a logical expression with zero or more variables. An ‘alpha term’ is any variable A such that when X is in disjunctive normal form, A appears unnegated in every disjunct.”
I should write a paper. :)
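The definition can be exercised by machine, too; a sketch assuming sympy is available (to_dnf, Or, and And are real sympy names):

from sympy import symbols
from sympy.logic.boolalg import to_dnf, Or, And

a, b, c, d, e = symbols("a b c d e")
expr = (a & b) | (c & ((a & d) | (a & e)))

dnf = to_dnf(expr, simplify=True)
print(dnf)  # e.g. (a & b) | (a & c & d) | (a & c & e)

# 'a' appears unnegated in every disjunct, so it is an "alpha term".
disjuncts = dnf.args if isinstance(dnf, Or) else (dnf,)
print(all(a in (disj.args if isinstance(disj, And) else (disj,))
          for disj in disjuncts))  # True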
There is nomenclature for this, and you just have to apply a “higher-level” concept, namely the material conditional. The expression ¬(a→b) is equivalent to a∧¬b (a not implying b is equivalent to a being true and b not being true). In your case, we can translate your expression as follows:
a ∧ (b ∨ c) ≡ ¬(a→(¬b∧¬c))
For that to be true, the expression a→(¬b∧¬c) must be false, which can only happen if a is true and (¬b∧¬c) is false (i.e. b or c is true), since (t→f) is the only true/false combination for which the material conditional yields false.
Your second expression is even simpler (counterintuitively), using the fact that a→b is equivalent to ¬a∨b:
For that to be true, a being true is a necessary condition, otherwise the left side would always be true and the right side would always be false, and the expression (t→f) is always false, yielding your result formally. If you don’t like going a level up, you can also write down a truth table for all involved parameters. It’s a valid form of proof.
I hope this helped. :)
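Since the truth table was mentioned as a valid form of proof, here is one written out mechanically for the first translation (a sketch; → is encoded as (not x) or y):

from itertools import product

implies = lambda x, y: (not x) or y

print("a b c   a AND (b OR c)   NOT(a -> (NOT b AND NOT c))")
for a, b, c in product([False, True], repeat=3):
    lhs = a and (b or c)
    rhs = not implies(a, (not b) and (not c))
    print(f"{a:d} {b:d} {c:d}   {lhs:d}                {rhs:d}")
# Both columns agree on all eight rows, confirming the translation.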
So b or c must be true.

Then b as well, or that other thing.

Isn’t that just kind of nonsensical? Also, naming a thing is (imo) less important than defining it :P and how would this be defined?
a AND (b OR c)

Hmm. Here, I’d say a must “obviously” be true (for the whole thing to be true) because it’s the “leading term”, which I admit is not some standard phrase (but what else would you call it?).

(a AND b) OR (c AND ((a AND d) OR (a AND e)))

Less obvious, but you can “factor out” the a, so to speak, to be the leading term again. If I skip a step here or screw up, I apologize:

(a & d) | (a & e) => a & (d | e)
c & (a & ..) => a & c & (..)
(a & ..) | (a & ....) => a & (.. | ....)
so, => a & (b | (c & (d | e)))

Then, since it’s the leading term, again I argue it’s obvious. My brain seems to read it left-associatively, but English famously reads left-to-right, so maybe for others, right-associatively. Maybe ... & a is clearer to some. But you’re right: if I can’t call it the leading term, and it’s buried, I don’t have a word. A necessary input, required, discriminant, something along those lines.

In CNF the second formula is

a & (b | c) & (b | d | e)

This form makes it obvious that a must be true.
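That CNF can also be double-checked by machine; a minimal sketch assuming sympy is available (to_cnf, Equivalent, and satisfiable are real sympy functions):

from sympy import symbols
from sympy.logic.boolalg import to_cnf, Equivalent
from sympy.logic.inference import satisfiable

a, b, c, d, e = symbols("a b c d e")
original = (a & b) | (c & ((a & d) | (a & e)))
cnf = a & (b | c) & (b | d | e)

print(to_cnf(original, simplify=True))          # e.g. a & (b | c) & (b | d | e)
print(satisfiable(~Equivalent(original, cnf)))  # False: the two never differ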