Until recently, when a person died, a family member would contact Facebook to report the death, at which point the account was memorialized: it remained visible but could no longer be used. Now things have changed with the introduction of the ‘legacy contact’.
It seems that even Elon Musk, the entrepreneur behind SpaceX and Tesla, has a pet peeve or two of his own. One of them happens to be Artificial Intelligence (AI). Speaking at MIT's AeroAstro Centennial Symposium, he warned: "I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it's probably that. So we need to be very careful." Musk added that, at the very least, there should be some regulatory oversight so that humanity does not get itself killed.
Today, let's learn one concept and straight away apply it to a real problem. The concept is Backtracking.
This is not a new concept to us; we already follow it in our day-to-day life, which only shows how closely Computer Science and its concepts are related to the real world. Don't believe me?
Here is an example. Suppose you are at point A and want to reach point B. Three roads, say X, Y and Z, lead out of A, and you have no idea which is the right one. What will you do? You will pick one of the three, say X. Eventually you find that X does not lead to B. What now? You will come back to A and try path Y instead. Isn't that exactly what you would do? And, for your information, you have just followed the principle of Backtracking: try a choice, and if it fails, undo it and try the next one. The short C sketch below turns this little story into code.
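Here is a minimal backtracking sketch in C; the road map, the node numbers and the function name find_path are all made up for illustration. Node 0 stands for A, node 4 for B, and nodes 1, 2 and 3 for the roads X, Y and Z. The function tries one road at a time and, on hitting a dead end, undoes the choice and tries the next road.

#include <stdio.h>

#define N 5
/* roads[i][j] is 1 if a one-way road runs from i to j.
   A(0) leads to X(1), Y(2), Z(3); only Y leads on to B(4). */
int roads[N][N] = {
    {0, 1, 1, 1, 0},   /* A */
    {0, 0, 0, 0, 0},   /* X: dead end */
    {0, 0, 0, 0, 1},   /* Y: reaches B */
    {0, 0, 0, 0, 0},   /* Z: dead end */
    {0, 0, 0, 0, 0}    /* B */
};
int visited[N];

/* Try every road out of 'here'; undo the move (backtrack)
   whenever it fails to reach 'goal'. */
int find_path(int here, int goal) {
    if (here == goal)
        return 1;
    visited[here] = 1;
    for (int next = 0; next < N; next++) {
        if (roads[here][next] && !visited[next]) {
            printf("trying %d -> %d\n", here, next);
            if (find_path(next, goal))
                return 1;
            printf("dead end, backtracking from %d\n", next);
        }
    }
    visited[here] = 0;   /* un-mark: other routes may still pass through */
    return 0;
}

int main(void) {
    if (find_path(0, 4))
        printf("reached B\n");
    return 0;
}

Running it prints the failed attempt down X before the successful route through Y, which is exactly the come-back-to-A-and-try-the-next-road behaviour described above.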
It has emerged that Google has been tracking smartphone users everywhere they go, marking each location with a red dot on a map so that a person's movements can be identified at a glance.
If you do not believe this, recall the scene in Minority Report where Tom Cruise is on the run from the law but is unable to avoid detection because, everywhere he goes, constant retina scans feed his location back to a central database. That is exactly what we are talking about here.
This post is a simple example of deception. It shows how a programmer can seemingly defy the very important rule that every C program must have a main() and still make the program run. The idea is illustrated on a small program, though it can be scaled to much bigger and more complex ones.
#include <stdio.h>
#define decode(s,t,u,m,p,e,d) m##s##u##t   /* pastes the tokens m,a,i,n into "main" */
#define begin decode(a,n,i,m,a,t,e)        /* so "begin" expands to "main" */
int begin() {
    printf(" hello ");
}
This program appears to run without main(). In reality, the ## token-pasting operator assembles the identifier main out of the macro arguments (m, a, i, n), so after preprocessing the compiler still sees an ordinary main() function.
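As a quick check (a sketch; the file name trick.c is only a placeholder), you can stop GCC after the preprocessing stage with gcc -E trick.c; buried in the output is an ordinary definition, roughly:

int main() {
    printf(" hello ");
}

So the rule is not really broken: main() is still there, just hidden behind the macros until the preprocessor runs.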