April 8, 2012

Most efficient way to LIMIT results in a JOIN?

Question by johnnietheblack

I have a fairly simple one-to-many type join in a MySQL query. In this case, I’d like to LIMIT my results by the left table.

For example, let’s say I have an accounts table and a comments table, and I’d like to pull 100 rows from accounts and all the associated comments rows for each.

The only way I can think to do this is with a sub-select in the FROM clause instead of simply selecting FROM accounts. Here is my current idea:

SELECT a.*, c.* FROM 
    (SELECT * FROM accounts LIMIT 100) a
    LEFT JOIN `comments` c on c.account_id = a.id
    ORDER BY a.id

However, whenever I need to do a sub-select of some sort, my intermediate level SQL knowledge feels like it’s doing something wrong.

Is there a more efficient, or faster, way to do this, or is this pretty good?

By the way…

This might be the absolute simplest way to do this, which I’m okay with as an answer. I’m simply trying to figure out if there IS another way to do this that could potentially compete with the above statement in terms of speed.

Answer by Priyank

Looks perfect to me.

Just wondering if it is OK with you to receive more or fewer than 100 rows from the above query.
If a particular accounts row has several matching rows in comments, you will get more than 100 rows. (With your LEFT JOIN, accounts without any comments still appear, with NULLs in the comments columns; an INNER JOIN would drop those accounts and could return fewer than 100 rows.)
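The row-count behavior can be seen with a small sketch. This uses SQLite in memory with a made-up miniature version of the accounts/comments schema (three accounts limited instead of 100), just to illustrate how the joined result size differs from the LIMIT:

```python
import sqlite3

# Hypothetical miniature schema mirroring the accounts/comments example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, account_id INTEGER);
""")
conn.executemany("INSERT INTO accounts (id) VALUES (?)", [(i,) for i in range(1, 6)])
# Account 1 has three comments, account 2 has none, accounts 3-5 have one each.
conn.executemany(
    "INSERT INTO comments (account_id) VALUES (?)",
    [(1,), (1,), (1,), (3,), (4,), (5,)],
)

rows = conn.execute("""
    SELECT a.id, c.id
    FROM (SELECT * FROM accounts ORDER BY id LIMIT 3) a
    LEFT JOIN comments c ON c.account_id = a.id
    ORDER BY a.id
""").fetchall()

# 3 accounts were limited, but the join returns 5 rows:
# account 1 appears three times, account 2 once (with a NULL comment id).
print(len(rows))  # 5
```

Note the `ORDER BY` inside the sub-select: a bare `LIMIT` without it picks an unspecified 100 rows.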

See: How can I optimize this query, takes more than a min to execute

Answer by Starx

No, the query is fine just the way it is. You might want to select specific fields instead of a.* and c.*, though.

March 26, 2012

Which performs better in MySQL?

Question by johnnietheblack

I would like to update 3 rows of a single table, with different values each.

The way I see it I could either…

Option 1.
Simply create a PHP loop and run a query for each row I’d like to update (this is the ‘classic’ approach, maybe).

Option 2.
Combine everything into a single query, using syntax that I have heard can be slower depending on the situation (example copied from: this website):

UPDATE mytable
SET myfield = CASE other_field
    WHEN 1 THEN 'value'
    WHEN 2 THEN 'value'
    WHEN 3 THEN 'value'
END WHERE id IN (1,2,3)

Which is a better, faster option?

Answer by Daan

I’m almost certain the single MySQL query will be slower than separate UPDATE queries, but you could always try. If immediate consistency is not a concern, you could also consider using individual UPDATE LOW_PRIORITY queries, which should be faster still.

The best way to know what works for your application, as always in programming, is to benchmark the performance of both queries and experiment a bit 🙂
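In that spirit, here is a rough benchmark sketch. It uses SQLite in memory (so the absolute timings are illustrative only; real MySQL numbers will differ, especially with network round-trips per query) and keys the CASE on `id` rather than the question's `other_field`:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (id INTEGER PRIMARY KEY, myfield TEXT)")
conn.executemany("INSERT INTO mytable (id) VALUES (?)", [(i,) for i in (1, 2, 3)])

# Option 1: one UPDATE per row, issued in a loop.
start = time.perf_counter()
for row_id, value in [(1, "a"), (2, "b"), (3, "c")]:
    conn.execute("UPDATE mytable SET myfield = ? WHERE id = ?", (value, row_id))
loop_time = time.perf_counter() - start

# Option 2: a single UPDATE with a CASE expression.
start = time.perf_counter()
conn.execute("""
    UPDATE mytable SET myfield = CASE id
        WHEN 1 THEN 'a'
        WHEN 2 THEN 'b'
        WHEN 3 THEN 'c'
    END WHERE id IN (1, 2, 3)
""")
case_time = time.perf_counter() - start

print(f"loop: {loop_time:.6f}s  single CASE: {case_time:.6f}s")
```

Over a real client/server connection, per-query latency tends to dominate for small updates, which is why the single-statement form often wins there.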

Answer by Starx

A single query will be far faster than querying the server multiple times in a loop.

So, the second way is the way to go.

November 29, 2010

What sort of page weight / size should I be aiming for when optimising for mobile?

Question by Simon Scarfe

I’m looking to optimise the mobile browser experience in a small webapp, using the awesome jQuery mobile to do so.

It goes without saying that a user doesn’t want to DL 200k of data; I’m just trying to draw the line between using external and internal URLs. Is there any existing guideline on what sort of page sizes / loading times I should be shooting for? I’d prefer to stick to internal URLs (keeping the mobile interface effectively in one place from a maintenance point of view), but am wary of bogging the user down with lots of information that they have no intention of viewing.

Answer by galambalazs

One thing to note is that e.g. iPhone won’t cache components bigger than 25K.

Also consider minifying and gzipping your code. And @jfar got one thing right: HTTP requests can be a huge performance hit, so you may also want to reduce them as much as possible.
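To see why gzipping pays off so well on markup-heavy pages, here is a tiny sketch using made-up repetitive HTML (in practice the server does this for you via something like Apache's mod_deflate; this just shows the ratio):

```python
import gzip

# Repetitive widget markup, the kind of thing templated pages are full of.
html = ("<div class='item'><p>Some repeated widget markup</p></div>\n" * 500).encode()
compressed = gzip.compress(html)

print(len(html), len(compressed))
print(f"compressed to {100 * len(compressed) / len(html):.1f}% of original size")
```

Real pages are less repetitive than this, but HTML, CSS, and JavaScript still commonly compress to a fraction of their raw size.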

Answer by Starx

I would say try to keep your page down to around 100kb, because if you manage that, the cost of loading separate pages per user request should be negligible.

However, loading the whole pile of content at once and using simple JavaScript show/hide to display it in an almost realtime manner might impress the viewer too.

July 28, 2010

What's faster/more efficient: Continuous Deleting OR Continuous Updating + Intermittent Deleting?

Question by user371699

I have a cron that runs through many rows, deleting the “bad” ones (according to my criteria). I’m just wondering what would be the best to optimize the script. I can do one of the following:

  1. Have the same cron instantly delete the “bad” rows upon finding them.

  2. Have the same cron instantly update the “bad” rows to status “1”, meaning bad. Then, I can set up another cron that runs once an hour, deleting all rows with status “1”. To make it faster, I guess I’d have an index on “status”, but that might also hurt performance.

Any suggestions?

Answer by Juergen Hollfelder

I am not experienced in MySQL, but on the other DBMSs I have worked with, an update followed by a delete does not help. Just try it with huge amounts of data and measure the time of the delete versus the update+delete. It helps if the columns that act as criteria for the “bad” ones have an index.

Answer by Starx

If you are thinking of updating a row as bad and then deleting it later, you are putting extra pressure on the server.

Deleting them directly is the better option.

If you think there are going to be a huge number of bad rows, write the cron in such a way that it does not delete more than 100 rows at a time. That should limit the server load to some extent.
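A batched delete along those lines can be sketched as follows. This uses SQLite in memory with a made-up `bad` flag as the criterion; deleting in chunks of at most 100 keeps each transaction short, so one long delete doesn't hold locks on the table (MySQL supports `DELETE ... LIMIT` directly, while SQLite needs the subquery form shown here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rows (id INTEGER PRIMARY KEY, bad INTEGER)")
# 1000 rows, half of them flagged as "bad".
conn.executemany("INSERT INTO rows (bad) VALUES (?)", [(i % 2,) for i in range(1000)])

deleted = 0
while True:
    cur = conn.execute(
        "DELETE FROM rows WHERE id IN (SELECT id FROM rows WHERE bad = 1 LIMIT 100)"
    )
    if cur.rowcount == 0:
        break  # no bad rows left
    deleted += cur.rowcount
    conn.commit()  # commit per batch so locks are released between chunks

print(deleted)  # 500
```

Each cron run would execute one such loop (or even just one batch), spreading the deletion load out over time.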

