My latest project involves developing a game where players have to guess names from images. Each game consists of 10 rounds, followed by a bonus round where players must wager their points on guessing the age of the person in the image to within 2 years. If they guess correctly, the wagered amount is added to their score; if not, it is subtracted.
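To make the rule concrete, here is a minimal sketch of the scoring I am aiming for (applyWager and the numbers in the example are made up for illustration, not the names used in my project):

// Intended bonus-round rule: a guess within 2 years of the actual age
// wins the wager, anything else loses it.
function applyWager(score, actualAge, guessedAge, wager) {
    if (Math.abs(guessedAge - actualAge) <= 2) {
        return score + wager;   // correct guess: add the wagered points
    }
    return score - wager;       // wrong guess: subtract the wagered points
}

// Example with an actual age of 40 and a wager of 50:
// applyWager(100, 40, 42, 50) -> 150 (42 is within 2 years of 40)
// applyWager(100, 40, 45, 50) -> 50  (45 is more than 2 years off)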
Recently, I ran into an issue with this code: it works correctly when the guess is below or exactly the right age, but it unexpectedly deducts points when the guess is 2 years above the actual age. This discrepancy has left me puzzled about what is going wrong in the implementation.
Below is a snippet of the relevant part of my code:
var age = calcAge(celeb_q_a[celeb_q_a.length - 1].dob);  // age of the person in the current question
age = age.toFixed(0);
alert(age);

var user_age = document.getElementById("answer-age").value;  // player's guessed age
var prev_score = score * 10;

// Award the wager if the guess matches the age, 2 below, or 2 above; otherwise subtract it.
if ((age - 2) == user_age || age == user_age || (2 + age) == user_age) {
    prev_score += (document.getElementById("wage").value * 1);
} else {
    prev_score -= (document.getElementById("wage").value * 1);
}
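To show the symptom more concretely, here is a stripped-down version of the comparison with hard-coded values (40 as the actual age; the guesses are string literals because that is what an input's .value gives me). The guess of 38 passes, but the guess of 42 does not:

var age = (40).toFixed(0);   // mirrors the age handling in my code
var low_guess = "38";
var high_guess = "42";

alert((age - 2) == low_guess || age == low_guess || (2 + age) == low_guess);    // shows true
alert((age - 2) == high_guess || age == high_guess || (2 + age) == high_guess); // shows false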