Well, technically you won't see much of a visual difference between 30 and 60 FPS. The main difference is that if someone moves really quickly, you'll likely see them move a fraction of a second earlier, because each new frame arrives sooner at the higher frame rate.
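Here's the arithmetic behind that "fraction of a second" as a quick Python sketch (the numbers are just the frame intervals implied by each rate):

```python
# How long each frame stays on screen at a few common rates,
# and the worst-case extra delay at 30 FPS compared to 60 FPS.
for fps in (30, 60, 72):
    print(f"{fps} FPS -> a new frame every {1000.0 / fps:.1f} ms")

# A sudden movement can show up as much as 1000/30 - 1000/60 ms
# later at 30 FPS than at 60 FPS.
print(f"worst-case difference: {1000 / 30 - 1000 / 60:.1f} ms")
```

So the gap is roughly 17 milliseconds at worst, which is why it's so hard to spot by eye.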
Just because a game says it's rendering at some frame rate doesn't mean you can see every one of those frames. A monitor has a fixed refresh rate, which means that (usually) every 1/60 of a second, it displays whatever the game reports is going on at that moment. When a game reports its frame rate, that's how fast the game can update what should be shown on the screen, but the screen isn't necessarily displaying every frame the game produces. It simply takes a snapshot of the video output at each refresh and displays that.
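If that's hard to picture, here's a toy simulation (plain Python, no real graphics, and the frame numbering is made up for illustration): the game finishes numbered frames at some rate, and the monitor grabs whichever frame is newest every 1/60 of a second.

```python
# Count how many distinct game frames actually reach a 60 Hz screen
# in one second, for different game frame rates.
def displayed_per_second(game_fps, refresh_hz=60):
    # At the k-th refresh, the newest finished game frame is
    # frame number floor(k * game_fps / refresh_hz).
    shown = {k * game_fps // refresh_hz for k in range(refresh_hz)}
    return len(shown)

for fps in (15, 30, 60, 100, 200):
    print(f"game at {fps} FPS -> {displayed_per_second(fps)} frames shown")
```

Past 60 FPS the count stays pinned at 60: the extra frames get rendered and then simply thrown away before the monitor ever samples them.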
The reason you don't see any flickering when you look at a screen is that any image you see tends to persist in your vision for about 1/5 of a second (if I remember right), which means all the frames from the monitor blend into each other within that window. If a monitor refreshed at 5 frames a second or slower, you'd very easily see it flicker. In any case, between 0 and 30 frames a second (game frame rate) you can see the game get choppy. Above 60 FPS, unless your monitor's refresh rate is set to 70 or 72 Hz, you won't see any difference.
All of this assumes you have V-sync on, which means the game only presents a new frame when the monitor refreshes; it prevents issues like tearing and takes some of the stress off your video card and processor.
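As a rough stand-in for what V-sync does (this is a sketch assuming a 60 Hz monitor, not real graphics code), the game loop sleeps until the next scheduled refresh instead of presenting frames as fast as it can:

```python
import time

REFRESH_HZ = 60            # assumed monitor refresh rate
FRAME = 1.0 / REFRESH_HZ   # one refresh interval, ~16.7 ms

def game_loop(run_seconds=2.0):
    next_refresh = time.perf_counter()
    end = next_refresh + run_seconds
    frames = 0
    while time.perf_counter() < end:
        # ... update the game state and render a frame here ...
        frames += 1
        # V-sync stand-in: wait for the next refresh before presenting,
        # rather than pushing out frames as fast as the CPU/GPU allow.
        next_refresh += FRAME
        delay = next_refresh - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
    print(f"{frames / run_seconds:.1f} FPS, capped near {REFRESH_HZ}")

game_loop()
```

A real driver blocks on the monitor's vertical blank signal instead of a timer, but the effect is the same: the game never outruns the display, so the CPU and GPU get to idle between frames.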
If you want to see what your monitor is capable of, you need to go into the advanced video settings (you know, the driver control panel instead of the standard Windows display dialog) and try changing the refresh rate. Make sure it asks you to confirm that the new mode is working, because if you set the rate too high, your monitor will go black or display an error. Usually the 70/72 Hz refresh rates only work at the lower resolutions.
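If you'd rather see the list without poking at settings blindly, here's a Windows-only Python sketch using ctypes (the struct layout follows the standard DEVMODEW header from the Windows SDK; on other systems the driver control panel or a tool like xrandr shows the same list). It prints every resolution/refresh combination the driver reports:

```python
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Display-mode half of the union in the real wingdi.h header.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Enumerate modes for the default display until the API runs out.
i = 0
seen = set()
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    combo = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
    if combo not in seen:
        seen.add(combo)
        print(f"{combo[0]}x{combo[1]} @ {combo[2]} Hz")
    i += 1
```

You'll generally notice the pattern I mentioned: the higher refresh rates only show up next to the smaller resolutions.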