Quote:
Originally Posted by markact
these help to demonstrate what I am missing in regards to partitioning the original set into 3 disjoint sets.
The following assumes a space of 4 points, so a maximum of 16 dichotomies. I have labelled each row with its base-10 equivalent.
Here is alpha (each x1 x2 x3 pattern appears once, with xN either 1 or 0):
X1 X2 X3 XN ID
0 0 0 0 0
0 0 1 1 3
0 1 0 0 4
0 1 1 1 7
1 0 0 0 8
1 0 1 1 11
1 1 0 0 12
1 1 1 1 15
We are left with 8 rows remaining. Since each of these has xN being either 1 or 0, the 8 rows are split into the following two partitions (S2+ and S2-).
[S2+]
X1 X2 X3 XN ID
0 0 0 1 1
0 1 0 1 5
1 0 0 1 9
1 1 0 1 13
[S2-]
X1 X2 X3 XN ID
0 0 1 0 2
0 1 1 0 6
1 0 1 0 10
1 1 1 0 14

Thank you for the detailed example. I can now see where the misunderstanding is. The set S1 (your alpha table) contains all the patterns that appear once, and none of the patterns that appear twice, in the original matrix. Therefore, if all of the patterns above are indeed in that matrix, then the decimal pattern 0 should not be in S1, since the decimal pattern 1 makes that pattern appear twice (on the first 3 columns). Similarly, the set S2 contains all patterns that appear twice, and none that appear once.
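To make the partitioning rule concrete, here is a minimal sketch (the function name `partition` and the tuple encoding of rows are my own choices, not from the thread). Each dichotomy is a bit-tuple (x1, x2, x3, xN); a row goes into S1 if its first-3-column pattern appears exactly once among the given rows, and into S2+ or S2- (by xN) if that pattern appears twice:

```python
from collections import Counter

def partition(rows):
    """Split dichotomies by how often their x1..x_{N-1} prefix occurs.

    S1  : prefix appears exactly once
    S2+ : prefix appears twice, with xN = 1
    S2- : prefix appears twice, with xN = 0
    """
    prefix_counts = Counter(r[:-1] for r in rows)
    s1, s2_plus, s2_minus = [], [], []
    for r in rows:
        if prefix_counts[r[:-1]] == 1:
            s1.append(r)
        elif r[-1] == 1:
            s2_plus.append(r)
        else:
            s2_minus.append(r)
    return s1, s2_plus, s2_minus

# All 16 dichotomies on 4 points, as (x1, x2, x3, xN) bit-tuples.
all_rows = [tuple((i >> k) & 1 for k in (3, 2, 1, 0)) for i in range(16)]

s1, s2_plus, s2_minus = partition(all_rows)
print(len(s1), len(s2_plus), len(s2_minus))  # -> 0 8 8
```

This illustrates the point of the reply: with all 16 rows present, every 3-column prefix occurs twice, so S1 is empty and the 16 rows split evenly into S2+ and S2-. Dropping decimal pattern 1 (0 0 0 1) makes the prefix 0 0 0 occur only once, which moves decimal pattern 0 into S1.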