Quote Originally Posted by Wesker
You've got to be kidding me. It's all about the internet speed; it doesn't matter whether or not you have a virtual machine, and even if you've got one, it's just stupid to run a game in a virtual machine. So you're telling me that League of Legends runs better on a virtual machine?

for the love of RA


It's the internet. It doesn't matter whether you've got DX9, DX5, DX12, or OpenGL; if your provider is a piece of shit you're not going to get more than 50 FPS.

The only thing that may work and give you higher FPS is a proxy server, and even that relies on your internet speed.

Don't play games on a 56k modem; it's outdated.


You know what, fuck it, in wonderland everything is possible. Here's how Jash doubles his internet speed: http://www.youtube.com/watch?v=lG5cEik2ABY



Windows reserves 20% of the bandwidth; if you open gpedit and go to the QoS settings you can set it to 0, and that should make a real difference in FPS, not all the shit people talk about here.


Fuck it, everyone, double your internet speed.
Ok so I'm gonna assume you're not joking and hopefully educate you a little.

Your internet connection can be measured in two ways: latency and transfer speed. Latency is the amount of time a packet takes to travel from your machine to a server (and usually the time it takes to return, as well); it is measured in milliseconds. Transfer speed is the rate at which you can move data to or from a server (upload and download, respectively). Let's explore how Tibia works for a minute.
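To see why these are two separate things, here's a rough back-of-envelope model: the time to deliver one packet is the one-way latency plus the transfer time for its bytes. The numbers below (packet size, latency, bandwidth) are illustrative assumptions, not measurements of any real connection.

```python
def delivery_time_ms(size_bytes, latency_ms, bandwidth_mbps):
    """Approximate one-way delivery time for a single packet:
    latency plus time to push the bytes through the pipe."""
    transfer_ms = (size_bytes * 8) / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + transfer_ms

# A ~50-byte game packet on an assumed 10 Mbit/s line with 80 ms latency:
t = delivery_time_ms(50, latency_ms=80, bandwidth_mbps=10)
print(round(t, 2))  # latency dominates: the transfer part is only 0.04 ms
```

For tiny game packets like these, latency dominates completely; doubling your bandwidth barely changes the delivery time.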

The Tibia client uses a graphics engine. That graphics engine generates images and draws them to the screen. It renders as fast as it is asked to (for instance, at 30 FPS), but if it cannot achieve the requested rate, it slows down. For this discussion you can consider the graphics engine to be simply a GUI, displaying things as it is told to. The graphics engine generally handles all rendering, and each "action" (for instance, printing text to the screen) is done by Tibia's graphics processing threads in one go.
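A render loop like that can be sketched in a few lines. This is a minimal illustration of frame pacing, not Tibia's actual code; the function names and target rate are made up for the example. The point is that the loop redraws at the requested rate when it can, and simply slows down when a frame takes too long, with no network involvement anywhere.

```python
import time

def render_loop(draw_frame, target_fps=30, frames=5):
    """Frame-paced loop: draw, then sleep out the rest of the frame budget.
    If draw_frame overruns the budget, the loop just runs slower."""
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.monotonic()
        draw_frame()                      # GUI redraws whatever it was told to show
        elapsed = time.monotonic() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # pace to the requested frame rate

# Trivial demo: draw three frames at a 60 FPS target.
render_loop(lambda: None, target_fps=60, frames=3)
```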

Now, behind the GUI in most (if not all) applications, you will find code which the end user never really sees. They might know roughly what it does, but they will not know exactly how it does it (not without reverse engineering it). That code generally handles all the non-GUI workload in the program. In this instance, that means work like calculating numbers to display on the screen (e.g. using an experience-to-level formula to work out your current level) and TCP communications.
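As a concrete example of that kind of back-end work, here is the commonly cited Tibia experience formula and a simple inversion of it. Treat the exact formula as an assumption for illustration; the point is that this calculation happens entirely on your machine, with no network round trip.

```python
def exp_for_level(level):
    """Cumulative experience required to reach a level
    (the commonly cited Tibia formula, assumed here)."""
    return 50 * (level ** 3 - 6 * level ** 2 + 17 * level - 12) // 3

def level_from_exp(exp):
    """Invert the formula by simple search, as a client back end might
    when deciding what number to hand the GUI."""
    level = 1
    while exp_for_level(level + 1) <= exp:
        level += 1
    return level

print(exp_for_level(8))      # 4200
print(level_from_exp(4200))  # 8
```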

What we want to look at is the TCP communications. TCP is basically a standardised system for sending data over the internet: you send and receive data in an agreed format to keep it consistent. This data will contain entire commands, which we could pseudocode as something like UpdateHealth(200), meaning "update your health to 200". Once the TCP client (Tibia) has received this data, it will issue a command to the GUI to update the health bar and the battle list entry for HPPC.
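That receive-decode-dispatch step can be sketched like this. The opcode value and byte layout here are invented for the example (this is not Tibia's real protocol); it only shows the shape of the idea: raw bytes come in over TCP, the back end decodes them into a command, and the command calls a GUI function.

```python
import struct

OP_UPDATE_HEALTH = 0xA0  # hypothetical opcode, not Tibia's real one

def handle_packet(data, gui):
    """Decode one packet and forward the command to the GUI layer."""
    opcode = data[0]
    if opcode == OP_UPDATE_HEALTH:
        (hp,) = struct.unpack_from("<H", data, 1)  # little-endian u16 payload
        gui.update_health(hp)                      # the GUI redraws on its own

class FakeGui:
    """Stand-in for the real GUI layer."""
    def __init__(self):
        self.health = None
    def update_health(self, hp):
        self.health = hp

gui = FakeGui()
handle_packet(bytes([OP_UPDATE_HEALTH]) + struct.pack("<H", 200), gui)
print(gui.health)  # 200
```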

So basically what I'm saying is this: you receive a packet containing a command, that command is processed by a back-end TCP client, and it calls some function in the GUI to change what the player sees. The link between FPS and connection speed is practically non-existent. The work done by the GUI is issued in one go (i.e. the client sends a "walk right" packet, the server receives it, the server returns a packet accepting that the client has walked right, and the client acts on it by moving the character one square to the right). So the client receives only one packet per issued command? Well, what if you were to send and receive only that one packet? The client would carry on processing that command until it completed. That means that once the command is issued to move your character X pixels to the right, it is processed fully regardless of other TCP traffic.
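The "one packet, many frames" point can be sketched as follows. The tile width and per-frame step are assumed numbers, not Tibia's real values; what matters is that a single accepted walk command yields a whole sequence of frame positions with no further network input.

```python
TILE_PIXELS = 32   # assumed tile width in pixels
STEP_PIXELS = 8    # assumed pixels moved per rendered frame

def animate_walk_right(start_x):
    """Yield the x position for every frame of ONE walk command.
    No more packets are needed once the command is accepted."""
    x = start_x
    while x < start_x + TILE_PIXELS:
        x += STEP_PIXELS
        yield x  # the renderer draws each of these frames on its own

frames = list(animate_walk_right(0))
print(frames)  # [8, 16, 24, 32] -- four frames from a single packet
```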

What you're implying is that each frame is only processed when a new packet is received. That's simply not true. If CIP were to send out a packet for each pixel a character or creature moved, it would be at least 100x harder on their network and servers to process the data. It's very common in MMORPGs to push as much work as possible onto the customer's machine, because at the end of the day only so much can be done there. For instance, you can't handle the entire event raised when a creature is killed on a player's machine. The main reason, of course, is that it would then be entirely possible to simply inject into the client, run that code every 100 milliseconds to kill a Demon or Hellgorak or whatever, and take all the experience from it (potentially also duplicating items and moving walls, as it used to be around Tibia v6.1).
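A quick back-of-envelope comparison shows why per-pixel confirmation would be so much heavier. Again, the 32-pixel tile is an assumption; multiply by every visible creature on every connected client and the difference balloons well past this single-step factor.

```python
TILE_PIXELS = 32  # assumed tile width in pixels

# Packets the server must send to move ONE character ONE tile:
per_pixel_packets = TILE_PIXELS  # confirm every pixel: 32 packets
per_tile_packets = 1             # confirm the whole step: 1 packet

print(per_pixel_packets // per_tile_packets)  # 32x more traffic per step
```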

I didn't finish this post due to RL.