Tom wrote:
The point is that you can take a knapsack problem (in NP) and convert it to a dog problem, so if you can solve the dog problem, you can solve the knapsack problem. = tilting at windmills
Hmm... I'm starting to doubt that this reduces to a 1-D knapsack.
In Garthor's example, I believe he tried to reduce the coordinates from (smart, funny) to (smart, magnitude-smart). I'm still cogitating on whether the base dog (smart*, 0) is proper...
Bootyboy wrote:
Hmm... I'm starting to doubt that this reduces to a 1-D knapsack. Would love to see you drop by chatters and think in public :] |
My solution (via bridging rules):
1. Eliminate the requirement that the sum of the dogs' characteristics cannot be negative, partly because a -998 dog is better at something than a -999 dog, which logically suggests that some level of that trait exists in positive abundance despite the negativity of the scale. (If you just wanted to convert the whole thing to positive numbers and use the unbounded knapsack solution here, you could, but I don't think the answer would be as good as if you take the rest of the rules into consideration.)
2. Ask Bessy to provide a value representing the importance of smartness versus funniness in the dogs, with the understanding that it is highly unlikely that smartness is really of equivalent value to funniness. This is based wholly on her subjective opinion, which is inseparable from her choice of trying to find the "best" candidates to showcase. In doing this, Bessy is deciding which trait is more important to her.
3. Perform a simple sort to organize the list of candidates by the resulting weighted sums (see the sketch after this post). She may end up with dogs that don't have a lick of smartness or funniness in them, but the computer will give her a good idea of the highest overall value of the dogs based on these traits, insofar as she's able to measure them, weighted by her subjective opinion of the value of each.
If Bessy wants to put her best foot forward at the dog show, it's not hard for the computer to guide her once she figures out exactly what that constitutes, without such vagueness as "smartness = funniness, except when I'm showcasing the dogs, in which case smartness != funniness, as one or the other cannot be negative." Silly girl.
Of course, since we couldn't solve this problem without such bridging rules, clearly none of us are real programmers. ;) |
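A minimal sketch of rule 3 in Python, assuming Bessy's subjective importance weights arrive as two plain numbers (the names rank_dogs, w_smart, and w_funny are made up for the example):

def rank_dogs(dogs, w_smart, w_funny):
    """dogs: list of (smart, funny) pairs. The weights encode Bessy's
    subjective opinion of how much each trait matters (rule 2)."""
    return sorted(dogs, key=lambda d: w_smart * d[0] + w_funny * d[1],
                  reverse=True)

# Example: funniness matters twice as much to Bessy as smartness.
print(rank_dogs([(5, -2), (-3, 9), (4, 4)], w_smart=1.0, w_funny=2.0))
# -> [(-3, 9), (4, 4), (5, -2)]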
Well, to disprove that, you just have to find an example of a knapsack problem that converts to a dog problem that doesn't have the same optimal selection. As far as I can tell, the only issue with Garthor's conversion is when the weight scale W dwarfs the value scale V, since you might take only the first dog (0,X), or some smaller set, as the optimal solution. But this is easily remedied by changing the scale or making the values = X+V so that it is always prudent to select them. In that case the goal is simply to select enough dogs to minimize the leftover capacity (X-W), which is exactly what the knapsack problem is.
That said, there's nothing wrong with using practical heuristics or combo brute-force solutions to problems in NP, especially when the number of items is so small (<100 in this case). I'm certain this is perfectly solvable in relatively short computer time. But a non-polynomial solution should probably be mentioned upfront. |
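For what it's worth, a minimal sketch of the brute-force route in Python, assuming the usual reading of the dog problem (maximize the combined total with both trait sums non-negative); it enumerates every subset, so it is only sane for a couple dozen dogs, nowhere near 100:

from itertools import combinations

def best_subset(dogs):
    # dogs: list of (smart, funny) pairs. Exhaustively try every subset
    # and keep the best total where both trait sums are non-negative.
    best, best_total = (), 0
    for r in range(1, len(dogs) + 1):
        for subset in combinations(dogs, r):
            s = sum(d[0] for d in subset)
            f = sum(d[1] for d in subset)
            if s >= 0 and f >= 0 and s + f > best_total:
                best, best_total = subset, s + f
    return best, best_total

print(best_subset([(-5, 7), (8, -6), (6, -3), (2, 1)]))
# -> (((-5, 7), (6, -3), (2, 1)), 8)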
I don't exactly understand what it is we're supposed to be doing here. Take a list of dogs, then select which ones should go to the exhibit?
|
Tom wrote:
Well, to disprove that, you just have to find an example of a knapsack problem that converts to a dog problem that doesn't have the same optimal selection. As far as I can tell, the only issue with Garthor's conversion is when the weight scale W dwarfs the value scale V, since you might take only the first dog (0,X), or some smaller set, as the optimal solution. But this is easily remedied by changing the scale or making the values = X+V so that it is always prudent to select them. In that case the goal is simply to select enough dogs to minimize the leftover capacity (X-W), which is exactly what the knapsack problem is. This is true. Just to make sure I understand what you're saying: there are assumptions and approximations that can be made for some datasets so that the knapsack model fully applies. I would agree with this. It would be interesting to change that threshold of "large W" over the course of the algorithm; it gives the whole thing more of an overarching annealing-algorithm feel. Then again, with such a small dataset, an overly large W would likely be solvable with a one-pass algo... |
I'm saying (via Garthor) that every knapsack problem can be turned into a dog problem, so if you can solve that dog problem in polynomial time, you've just proved that P=NP. His knapsack->dog conversion just needed a minor scale/additive factor (e.g., instead of making the mappings (V,-W), you'd make them (V+X,-W)).
|
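In code, that mapping is one line; here's a sketch in Python (the function name and the (funny, smart) tuple layout are mine, not anything from the contest):

def knapsack_to_dogs(items, X):
    # items: list of (value, weight) pairs; X: the knapsack's capacity.
    # Dog 0 is the "knapsack" itself; item i becomes a (X + Vi, -Wi) dog,
    # the additive X making every item-dog worth taking on its own.
    return [(0, X)] + [(X + v, -w) for (v, w) in items]

print(knapsack_to_dogs([(5, 50), (3, 30)], 100))
# -> [(0, 100), (105, -50), (103, -30)]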
Tom wrote:
I'm saying (via Garthor) that every knapsack problem can be turned into a dog problem, so if you can solve that dog problem in polynomial time, you've just proved that P=NP. His knapsack->dog conversion just needed a minor scale/additive factor (e.g., instead of making the mappings (V,-W), you'd make them (V+X,-W)). I'm not sure I follow. (V+smart, -W) makes one of the terms two independent variables with their own implicit relationship -- not a valid condition of the 1-D knapsack. |
I wrote up a dynamic programming solution to this, but again, the memory requirements are insane. You would need around 10 terabytes to do it with 100 dogs with value ranges of -1000 to 1000.
|
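Garthor's code isn't shown here, but for reference, a minimal sketch in Python of one standard pseudo-polynomial DP for this kind of problem: index the table by the (offset) smartness total and keep only the best funniness total per entry. Computing just the best score this way needs a table of only 2*1000*n + 1 entries; recovering which dogs were chosen takes extra bookkeeping per state, which is presumably where the terabytes come in. All names are mine:

def best_show(dogs):
    # dogs: list of (smart, funny) pairs, each trait roughly in [-1000, 1000].
    # dp[i] = best funniness total among selections whose smartness total
    # is i - OFFSET; float('-inf') marks unreachable totals.
    OFFSET = 1000 * len(dogs)          # shift so indices stay non-negative
    SIZE = 2 * OFFSET + 1
    NEG = float('-inf')
    dp = [NEG] * SIZE
    dp[OFFSET] = 0                     # the empty selection
    for smart, funny in dogs:
        # Walk indices in the direction that guarantees each dog is taken
        # at most once (the classic 0/1-knapsack trick, done both ways
        # because smartness may be negative).
        if smart >= 0:
            indices = range(SIZE - 1, smart - 1, -1)
        else:
            indices = range(0, SIZE + smart)
        for i in indices:
            prev = dp[i - smart]
            if prev != NEG and prev + funny > dp[i]:
                dp[i] = prev + funny
    # Best combined total with both trait sums non-negative.
    return max((i - OFFSET) + dp[i]
               for i in range(OFFSET, SIZE) if dp[i] >= 0)

print(best_show([(-5, 7), (8, -6), (6, -3), (2, 1)]))   # -> 8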
Also, an alternative for the Knapsack is to state it as the decision problem "can a value of at least V be achieved without exceeding the weight W?" (thank YOU, Wikipedia!) In this case, the first dog would be (-V, W) and the rest would be (Vi, -Wi). In order to take anything, you would need to take the first dog and some set of the other dogs.
You would also need to define taking no dogs at all to be an invalid answer, as otherwise its output is the same as taking a perfectly balanced selection. |
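The same construction in a sketch (again, my names; dogs are (funny, smart) pairs as in Garthor's mapping):

def decision_to_dogs(items, V, W):
    # Decision form: "is there a selection with value >= V and total
    # weight <= W?"  Dog 0 is (-V, W); item i becomes (Vi, -Wi).
    # Taking nothing must be declared invalid, and any non-empty valid
    # selection is forced to include Dog 0 (every other smarts value is
    # negative), so funny-sum >= 0 means value >= V and smart-sum >= 0
    # means weight <= W.
    return [(-V, W)] + [(v, -w) for (v, w) in items]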
Garthor wrote:
I wrote up a dynamic programming solution to this, but again, the memory requirements are insane. You would need around 10 terabytes to do it with 100 dogs with value ranges of -1000 to 1000. Now we just need someone with 10 TB of hard drive space in some way, shape, or form to test the program. If we're lucky, we'll know whether it works in a week... |
Bootyboy wrote:
(V+smart, -W) makes one of the terms two independent variables with their own implicit relationship -- not a valid condition of the 1-D knapsack. I think this is getting lost in the semantics, so let me just give an example. Suppose you have an arbitrary knapsack problem:

Tolerated weight: X = 100
Item 1: value V1 = 5, weight W1 = 50
Item 2: V2 = 3, W2 = 30
Item 3: V3 = 3, W3 = 30
Item 4: V4 = 3, W4 = 30
Item 5: V5 = 1, W5 = 10

The solution is to take {2,3,4,5} with net V = 10.

Now we translate this into the dog problem, which is actually a much broader case of the knapsack (hence the gut feeling that it too was NP-complete, even before seeing Garthor's mapping):

F0 -> 0, S0 -> X
Fi -> X + Vi, Si -> -Wi

Dog 0: fun F0 = 0, smarts S0 = 100
Dog 1: F1 = 105, S1 = -50
Dog 2: F2 = 103, S2 = -30
Dog 3: F3 = 103, S3 = -30
Dog 4: F4 = 103, S4 = -30
Dog 5: F5 = 101, S5 = -10

The solution here is to take Dogs {0,2,3,4,5}, which translates to a knapsack of {2,3,4,5}, since Dog 0 is the "knapsack". Since any knapsack problem can be mapped in this way, if we can solve the dog problem in polynomial time, we can solve the knapsack problem too (or, in fact, any other NP-complete problem). |
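If anyone wants to sanity-check that example, a quick brute-force run (a sketch, assuming the "maximize fun + smarts with both sums non-negative" rule) does confirm it:

from itertools import combinations

X = 100
items = [(5, 50), (3, 30), (3, 30), (3, 30), (1, 10)]    # (Vi, Wi)
dogs = [(0, X)] + [(X + v, -w) for (v, w) in items]      # (fun, smarts)

def total(subset):
    return sum(dogs[i][0] + dogs[i][1] for i in subset)

valid = [s for r in range(len(dogs) + 1)
           for s in combinations(range(len(dogs)), r)
         if sum(dogs[i][0] for i in s) >= 0
         and sum(dogs[i][1] for i in s) >= 0]
print(max(valid, key=total))
# -> (0, 2, 3, 4, 5): Dog 0 plus the knapsack solution {2,3,4,5}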
Oh right, forgot I didn't actually post my algorithm:
/* Stupid Dogs */ |
Ya Diggg wrote:
What the fuck. I just got a headache looking at all of this -.- Don't let it bug ya. They're just trying to figure out how to tell an overgrown calculator to sort a list by some delightfully whacked criteria in such a way that said calculator doesn't spontaneously combust. While I proposed a good enough workaround by taking the otherworldly qualities out of the problem, such a move merely undermines the value of an excellent puzzle to a voracious intellect. |
Geldonyetich: proposing fundamentally changing the problem as a solution to the problem is very, very, very wrong.
Also: I never claimed that my solution would be particularly easy to understand. I suppose I could've thrown some comments in there, but really, if you don't already know how dynamic programming works, I doubt it'd be helpful. You need an example that you can actually properly visualize first. |
Garthor wrote:
Geldonyetich: proposing fundamentally changing the problem as a solution to the problem is very, very, very wrong. On the contrary, recognizing that the problem itself is wrong is the kind of right that stops space shuttles from burning up in the atmosphere. If you can't recognize the fundamental issue with measuring smartness and funniness numerically and then attempting a weighted decision based upon them without considering their comparative worth, you're in the running to design an excellent program that renders an irrefutably correct answer of completely worthless application. What I was saying just now is that you're willing to overlook this because the problem is delicious to solve within these arbitrary parameters. Besides, that's not to say there won't be a perfectly valid set of parameters to match the knapsack problem. |
The only sense in which this problem was "wrong" is that it was fundamentally impossible to solve given the contest's requirements (runtime of under 10 seconds on a 900 MHz Intel or somesuch). Again, you can't just pretend that the parameters of a problem are different because you can't figure out the answer and "Negative numbers don't make sense!"
Mentioning the space shuttle is quite a spectacular non sequitur. If you weren't so insistent on your point, I'd say you were just being a troll. You're just so unbelievably insane. |
The datasets I provided were really just two unoptimizable datasets... the answers aren't necessarily important ;).