There are rumors that a lot of computers will have a problem with the year 2000. Because they use only two digits to represent the year, the date will suddenly jump from 1999 back to 1900. In fact, there are many other, similar problems. On some systems, a 32-bit integer is used to store the number of seconds that have elapsed since a certain fixed date; when 2^32 seconds (about 136 years) have elapsed, the date jumps back to that fixed date.
Now, what can you do about all that mess? Imagine you have two computers C1 and C2 with two different bugs: one with the ordinary Y2K bug (i.e. switching to a1 := 1900 instead of b1 := 2000) and one switching to a2 := 1904 instead of b2 := 2040. Imagine that C1 displays the year y1 := 1941 and C2 the year y2 := 2005. Then you know the following (assuming that there are no other bugs): the real year can't be 1941, since then neither bug would have triggered yet and both computers would show the same, correct date, contradicting y2 = 2005. If the year were 2005, y1 would be 1905, so this is impossible, too. Looking only at C1, we know that the real year is one of the following: 1941, 2041, 2141, and so on. We can now calculate what C2 would display in these years: 1941, 1905, 2005, and so on. So, in fact, it is possible that the actual year is 2141.
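The displayed year in this example can be computed with simple modular arithmetic: a computer that jumps from year b back to year a shows a + (real - a) mod (b - a). A minimal sketch of this calculation (the helper name displayed is ours, not part of the problem):

#include <iostream>

// Year shown by a computer that wraps from year b back to year a,
// assuming real >= a (the machine already exists in that year).
int displayed(int real, int a, int b) {
    return a + (real - a) % (b - a);
}

int main() {
    // The example from the text: C1 wraps 2000 -> 1900, C2 wraps 2040 -> 1904.
    std::cout << displayed(2141, 1900, 2000) << "\n"; // prints 1941, matches y1
    std::cout << displayed(2141, 1904, 2040) << "\n"; // prints 2005, matches y2
    return 0;
}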
Calculating all this manually is a lot of work. (And you don't really want to do it every time you forget the actual year.) So your task is to write a program that does the calculation for you: find the first possible real year, knowing what some other computers say (yi) and knowing their bugs (switching to ai instead of bi). Note that the year ai is definitely not after the year in which the computer was built. Since the actual year can't be before the year any of the computers was built, the year your program is looking for can't be before any ai.
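One straightforward approach is a brute-force search: start at the largest ai (the answer can't be earlier) and test candidate years in increasing order until every computer's displayed year matches its yi. The following is a minimal sketch only; the input format (n followed by n lines "yi ai bi"), the upper search bound, and the message for the no-solution case are assumptions, since this excerpt does not specify them.

#include <iostream>
#include <vector>
#include <algorithm>

// One buggy computer: it displays y, and wraps back to a instead of reaching b.
struct Computer { int y, a, b; };

// Year shown by the buggy computer in the given real year (requires real >= a).
int displayed(int real, int a, int b) {
    return a + (real - a) % (b - a);
}

int main() {
    int n;
    std::cin >> n;                               // assumed input format: n, then n lines "y a b"
    std::vector<Computer> cs(n);
    int start = 0;                               // the answer can't precede any a_i
    for (auto& c : cs) {
        std::cin >> c.y >> c.a >> c.b;
        start = std::max(start, c.a);
    }

    // Try candidate years in increasing order; 10000 is an arbitrary safety bound.
    for (int year = start; year < 10000; ++year) {
        bool ok = true;
        for (const auto& c : cs)
            if (displayed(year, c.a, c.b) != c.y) { ok = false; break; }
        if (ok) {
            std::cout << year << "\n";
            return 0;
        }
    }
    std::cout << "no consistent year found\n";   // wording for this case is an assumption
    return 0;
}

On the example above (two computers with y1 = 1941, a1 = 1900, b1 = 2000 and y2 = 2005, a2 = 1904, b2 = 2040), this search would report 2141, the first year consistent with both displays.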