There is a known problem with "Get Latest Version" failing to update specific files in classic TFVC (server) workspaces.
If you are working in a multi-gigabyte codebase (like Visual Studio or the Windows source tree), your client does not need to scan the local filesystem looking for files that may have changed, because the contract you have with TFS is that you will explicitly check a file out when you want to edit it.
If, on the other hand, you mark a file as writable and change it without explicitly checking it out first, you confuse matters, because TFS has no record of your edit.
If you play by its rules and actually check things out before editing them, you don't confuse matters, and "get latest" really does as it says.
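The checkout contract can be sketched as a toy model (this is not the real TFVC protocol; the file names and the `pending_edits` structure are made up for illustration):

```python
# Toy sketch of a server workspace's checkout contract: the server only
# "sees" edits that were announced via an explicit checkout.
pending_edits = set()          # server-side record of checked-out files

def checkout(path):
    """Announce to the server that we intend to edit `path`."""
    pending_edits.add(path)

def edit_on_disk(path, files):
    """Modify the local copy; the server is NOT notified by this."""
    files[path] = files.get(path, "") + " (edited)"

local = {"a.cs": "v1", "b.cs": "v1"}

checkout("a.cs")               # playing by the rules
edit_on_disk("a.cs", local)
edit_on_disk("b.cs", local)    # edited without checkout: invisible to TFS

print(sorted(pending_edits))   # ['a.cs'] -- only a.cs is known to be changed
```

Both files changed on disk, but the server's view contains only the file that was checked out, which is exactly why silently making files writable confuses matters.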
As you've seen, you can force it to reassess everything, which uses far more bandwidth but behaves closer to how SourceSafe used to. TFVC server workspaces are a "checkout-edit-checkin" type of system where this is by design: an intentional decision made to massively reduce the amount of file I/O required to determine the state of your workspace.
Instead, the workspace information is saved on the server.
This allows TFVC Server Workspaces to scale to large codebases very efficiently.
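A minimal sketch of why no filesystem scan is needed (again a toy model, not the actual implementation; the version tables are hypothetical): "get latest" reduces to comparing two server-side tables.

```python
# Toy model: the server remembers which version it last gave this client,
# so "get latest" is just a diff of two server-side version tables --
# no local file I/O at all.
server_versions = {"a.cs": 3, "b.cs": 5, "c.cs": 2}
client_has      = {"a.cs": 3, "b.cs": 4, "c.cs": 2}   # recorded on the server

def get_latest():
    """Return only the items the server believes are stale on the client."""
    return [p for p, v in server_versions.items() if client_has.get(p, 0) < v]

print(get_latest())  # ['b.cs'] -- one download, regardless of codebase size
```

The cost is proportional to the number of changed items, not to the size of the working tree, which is how this scheme scales to very large codebases.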
The silliest thing I see is that "Get Latest Version" does nothing even when the local file has been deleted for whatever reason. In fact you might have deleted the entire folder (as in my case) and TFS still won't fetch the latest copy, because it does not look at the actual files but at the hidden directory where it records changes. The solution was removing the subfolder mapping using the "Manage Workspaces" window.

Team Foundation Server (TFS) keeps track of its local copy in a hidden directory called $TF.

TFVC Local Workspaces are the default in TFS 2012; if they are not enabled for you, ask your server administrator. (Organizations with very large codebases or strict auditing requirements may disable TFVC Local Workspaces.)
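The deleted-file trap follows directly from the server-side bookkeeping. A toy sketch (hypothetical names; the `force` flag stands in for a forced get, which ignores the recorded state):

```python
# Toy sketch of the "Get Latest does nothing" trap: the server still
# records that this client has version 3 of a.cs, so a plain get is a
# no-op even after the local file was deleted behind TFS's back.
server_versions = {"a.cs": 3}
client_has = {"a.cs": 3}       # server-side record; unaware of local deletes
disk = {}                      # the local file was deleted out-of-band

def get_latest(force=False):
    for path, version in server_versions.items():
        if force or client_has.get(path, 0) < version:
            disk[path] = f"content@v{version}"   # (re)download the file
            client_has[path] = version

get_latest()              # no-op: the server thinks we already have v3
print("a.cs" in disk)     # False
get_latest(force=True)    # a forced get ignores the recorded state
print("a.cs" in disk)     # True
```

Removing and re-creating the workspace mapping works for the same reason a forced get does: both discard the server's stale record of what the client supposedly has.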