Angry wrote: I was really pissed off to see this answer. It tells us that the @ means addition or multiplication and uses this uncertain property of @ to help us find a value for K. F this BS
dude, no. first close your eyes and count to ten while concentrating on tranquil wilderness scenes, and then read the problem again. it says, in no uncertain terms, that
k is an integer.
this means that, in statement 1, '@' can't stand for multiplication; if it did, then
k would have to be 1.5, a value that is prohibited because it's not an integer.
therefore, by process of elimination, '@' must be addition, and so
k must be 1. this is sufficient to determine the required value.
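the elimination step can be checked mechanically. the actual equation from statement 1 isn't quoted in this thread, so the sketch below assumes a hypothetical stand-in, 2 @ k = 3, chosen only because it reproduces the two candidate values discussed above (multiplication would force k = 1.5, addition would force k = 1):

```python
# Hypothetical stand-in for statement 1: "2 @ k = 3".
# '@' is known to be either addition or multiplication, and k must be an integer.

# Solve 2 @ k = 3 under each possible meaning of '@'.
solutions = {
    "addition": 3 - 2,        # 2 + k = 3  ->  k = 1
    "multiplication": 3 / 2,  # 2 * k = 3  ->  k = 1.5
}

# Keep only the meanings that yield an integer k, as the problem requires.
viable = {name: int(k) for name, k in solutions.items() if k == int(k)}

print(viable)  # {'addition': 1} -- multiplication is eliminated, so k = 1
```

only one meaning of '@' survives the integer constraint, which is exactly why statement 1 is sufficient.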
--
there's also one other thing wrong with your reasoning, in the case of the 'combined' statements: namely, you deduce that
k = 1, but then
you still think that @ can stand for either multiplication or addition.
it can't.
in both cases,
k can only be 1 if the '@' sign stands for addition, so, by setting
k equal to 1, you are also committing the '@' sign to stand for addition.
analogy: let's say y = x + 1, and you have other information telling you that x is either 5 or 8.
this means that y is either 6 (= 5 + 1) or 9 (= 8 + 1).
here's the deal, though:
you can't choose the values of x and y independently in this scenario. for instance, even though x can be 5 and y can be 9, you
cannot have both of those at the same time: if x is 5 then y must be 6, and if y = 9 then x must be 8.
same thing in the problem above: you can't have k = 1 without '@' standing for addition.
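the linked-values point in the analogy can be checked by just enumerating the consistent pairs:

```python
# The analogy: y = x + 1, with x known to be either 5 or 8.
# Enumerate every (x, y) pair that is actually consistent with y = x + 1.
pairs = [(x, x + 1) for x in (5, 8)]

print(pairs)  # [(5, 6), (8, 9)]

# x = 5 is possible, and y = 9 is possible, but the combination (5, 9)
# never appears: choosing x fixes y, so the two can't be picked independently.
assert (5, 9) not in pairs
```

same structure as the problem: once you commit to k = 1, the meaning of '@' is fixed to addition for free.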