genome project journal #2: 1,000th video update

It was almost a year ago that I first announced the Genome Project on the .org. Since then interest in the project has been marginal at best, but I haven’t gotten any less excited by it over the past year. In fact, if anything, as time has gone on my excitement has just grown — I love working on it, and it’s been so cool to see the spreadsheet grow and mutate over time. At the time of this writing I have a number of different things I’m tracking throughout the spreadsheet, and it’s been really interesting to see certain things change, and certain trends emerge. As I continue to build up the project, I anticipate many more neat things will come into focus.

It’s still far too early to come to any definite conclusions, but in celebration of my having entered the 1,000th AMV into the database, I figured now would be a good time to take a short break and do some preliminary analysis, to see what is currently developing within the confines of the project. I would encourage you, if you haven’t checked on the project in a while (or ever), to download the spreadsheet and take a look for yourself at what’s in there.

Note: You’ll need OpenOffice to properly view the document. Also, it’s extremely important that you go to Tools > Options > OpenOffice Calc > Calculate and uncheck the box which reads “Search criteria = and <> must apply to whole cells” if it’s not already unchecked. Otherwise many of the graphs and formulas throughout the spreadsheet will not give correct information.
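
(In case you’re curious why that setting matters: I’m assuming here that the tag and genre cells hold several values at once, so the lookup formulas need to match a value anywhere inside a cell rather than requiring the whole cell to be equal. Here’s a rough Python sketch of the difference; the cell contents are made up and only illustrate the two behaviors:)

```python
# Rough illustration of what the Calc option controls. Assumed (hypothetical)
# layout: a tag cell holding several values, e.g. "simple, drama, romance".
cell = "simple, drama, romance"

# Option CHECKED: a criterion like "simple" only counts cells that are
# exactly "simple" -- this cell would NOT be counted.
whole_cell_match = (cell == "simple")

# Option UNCHECKED (what the spreadsheet needs): the criterion matches
# anywhere inside the cell, so this cell IS counted as a "simple" video.
substring_match = "simple" in cell

print(whole_cell_match, substring_match)  # False True
```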

Before I go into this, a few basic things to keep in mind:

• For the most part, the really interesting stuff with regards to how videos have changed over time is still out of reach. I don’t have anywhere near an even enough spread of videos across the last 15 years to draw meaningful comparisons — only 24 videos from 2010 vs. 173 from 2014, for example. Take anything below which deals with development over time with a huge grain of salt.

• …Actually, best to take pretty much all of this with a grain of salt at this point. 1,000 videos seems like a lot, but it’s really not at all in the face of the 100,000+ videos entered into the .org’s catalog. And I fear there may be some selection bias on my part in terms of the types of videos entered into the database — I tend to prefer drama/romance/sentimental videos over action, comedy, or trailer videos, for example, and this is by and large reflected in the list. Until there’s a more representative mix of video genres, the legitimacy of any analysis can and should be questioned.

• I have done my best to compensate for this by always including both the raw count of videos as well as the percentage of such videos in any analysis below. Please always look at both numbers to get an idea of how trustworthy any given statistic is as a representation of the greater whole.

• At this time, I am not doing anything more than very basic statistical analysis — besides not knowing enough about stats to do anything too complex, I also don’t feel there’s enough raw data to justify it at this point. So we’re really just looking at basic counts and percentages right now, and doing basic comparisons between them. More complex stuff will (hopefully) come later, and (hopefully) with help from others who are more sophisticated than I am when it comes to this kind of thing.

Okay! Without further ado, here are a bunch of graphs and charts and numbers and commentary. Enjoy!

[Chart 01: fx / non-fx]

Let’s start with something that should surprise exactly nobody: There’s a large and undeniable discrepancy between the number of videos that use effects and those that don’t. This is not something I expect will even out as time goes on — I fully expect this statistic to stay at this level or be even more exaggerated in the future.

This is because I take a very broad view of what counts as an “effect” in any given video. The only effects I don’t count are crossfades and time stretching/compression (the former because it’s too ubiquitous, the latter because it’s too difficult to identify 90% of the time). Everything else — overlays, color changes, pretty much anything that manipulates the footage beyond its source — is counted as an effect, and so the large majority of videos out there have effects, even if most of them are very simple.

This, however, is why the “simple” tag exists. Though the call is somewhat subjective, I try to only tag videos with this if they are effects-less or if they use very basic effects, such as color changes, blurs, and overlays. These are the videos that most editors these days would probably call “effects-less”, although what they really mean is more along the lines of “not technical”. At this point, a sizable portion of the videos in the database have been tagged as “simple” — 59.1% of them, in fact.

Because so many videos carry the “simple” tag, it opens the door to some interesting analysis, and since this is one of the things that interests me personally, I decided to take a closer look at what we can learn from it.

[Chart 02: simple tag]

The idea for these particular graphs comes from .org user furious purpose, who thought it would be cool to find out whether people generally think more highly of simple videos or more technical videos. I, too, thought this would be interesting to see, so I ran the numbers and came up with the above two graphs. Now, a couple of things to note:

(1) There are significantly more star-rated videos with the “simple” tag than without. This is probably a reflection of the higher concentration of “simple” videos throughout the database, but please keep in mind that there are 131 fewer non-simple videos in this comparison.

(2) The most telling thing about this comparison is the “Percentage 3.75 and above” shown at the bottom of the respective charts — what these numbers tell us is that “simple” videos tend to be rated lower on the whole than those that are not simple. Although there are fewer “non-simple” videos in total, a sizable majority of them average a star rating of 3.75 or above. With the videos entered so far, then, people seem to prefer the more technically complex ones, at least at the time they rate the video.
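
If you want to reproduce that “Percentage 3.75 and above” figure outside the spreadsheet, the calculation is nothing fancy. Here’s a minimal sketch: the sample videos and field names are invented, and only the logic mirrors what the charts show.

```python
# Minimal sketch: share of star-rated videos averaging 3.75+ stars,
# split by the "simple" tag. The sample data below is invented.
videos = [
    {"simple": True,  "stars": 3.52},
    {"simple": True,  "stars": 3.91},
    {"simple": False, "stars": 4.12},
    {"simple": False, "stars": 3.67},
    # ...one entry per star-rated video in the database
]

for label, is_simple in [("simple", True), ("non-simple", False)]:
    group = [v["stars"] for v in videos if v["simple"] == is_simple]
    high = sum(1 for s in group if s >= 3.75)
    print(f"{label}: {len(group)} videos, "
          f"{100 * high / len(group):.1f}% rated 3.75 and above")
```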

[Chart 03: simple by genre]

There are a few similar graphs in the database, which you can explore at your leisure, but this graph shows what percentage of videos labeled with various genres are also labeled with the “simple” tag. The results are pretty much expected, but still cool to look at. Unsurprisingly, perhaps, dance videos (which tend to use very “visual” music), parody videos (which often make use of extensive effects work to emulate as closely as possible whatever they’re parodying), and horror videos (which tend to use effects to help set a distinct mood) populate the lower end of the spectrum, while character profile, sentimental, and romance videos, which tend to rely much more heavily on storytelling elements than on heavy effects work, are at the top.

One thing to keep in mind — and I’m not sure how this factors into the data analysis, if at all — is that a single video is almost always categorized under multiple genres (e.g. a video can be categorized as drama, romance, and serious all at once). As such, this graph doesn’t represent a count of total videos, but a count of total genre listings. Single videos are therefore represented multiple times throughout the data in this graph.
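
In other words, each video contributes one entry per genre it carries. Here’s a little sketch of how that kind of tally works; the genre names and data are just examples, not pulled from the actual spreadsheet:

```python
# Sketch of the genre tally: each video is counted once per genre it carries,
# so the totals are genre *listings*, not unique videos. Data is invented.
videos = [
    {"genres": ["drama", "romance", "serious"], "simple": True},
    {"genres": ["dance"],                       "simple": False},
    {"genres": ["parody", "comedy"],            "simple": False},
]

listings = {}  # genre -> (total listings, listings tagged "simple")
for v in videos:
    for g in v["genres"]:
        total, simple = listings.get(g, (0, 0))
        listings[g] = (total + 1, simple + (1 if v["simple"] else 0))

for g, (total, simple) in sorted(listings.items()):
    print(f"{g}: {100 * simple / total:.0f}% simple ({total} listings)")
```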

[Chart 04: fx / no fx / the rest]

A short and simple little chart here, but it shows that the majority of videos in the catalog so far lie somewhere between completely effects-less and full-on effects overload, as would be expected.

[Chart 05: star rating]

Let’s move on now to a few graphs dealing specifically with star ratings. Star ratings are, unfortunately, about the closest thing we have to a gauge of what the general public thinks of any given video. I say “unfortunately” because I don’t believe the star rating system does a very good job of accurately reflecting a person’s true opinion of a video, but as it’s all we have to go on, I’ll do what I can with it.

To begin, the above graph shows the distribution of star ratings for videos on the .org. Only videos that have 100 or more star ratings are taken into account, with a few exceptions (e.g. really old videos with ~70+ star ratings that nobody’s going to come across unless they’re searching for them specifically). Keep in mind that the star ratings entered into the database are the ratings at the time I entered the video; a star rating may have changed slightly since a video was entered, although hopefully not too much (this is the main reason I’m only taking into account videos with 100+ ratings).

The middle of the bell curve on this graph rests, perhaps unsurprisingly, at the 3.75-3.99 mark, indicating that people tend to over-rate videos. If ratings were spread evenly across a 1-5 star scale, the average would sit around 3; that’s not even close to what we observe. This doesn’t surprise me in the least, first because I think people tend to overrate things they like anyway, but more importantly because I’m sure my own bias has entered the mix. As an .org Donator, I get to see the star ratings for individual videos, and I tend to shy away from things rated below 3.00. As such, the data in here is probably not 100% pure, or even close, so please keep that in mind.
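
For the curious, here’s roughly how that distribution is put together: drop anything under 100 ratings, then bin the remaining averages into 0.25-wide buckets (3.75-3.99 and so on). The numbers below are placeholders, not real entries from the database:

```python
# Sketch of the star-rating distribution: keep videos with 100+ ratings,
# then bin their average rating into 0.25-wide buckets. Data is invented.
videos = [
    {"stars": 3.82, "num_ratings": 240},
    {"stars": 4.01, "num_ratings": 96},   # dropped: fewer than 100 ratings
    {"stars": 3.77, "num_ratings": 131},
]

bins = {}
for v in videos:
    if v["num_ratings"] < 100:
        continue
    low = int(v["stars"] / 0.25) * 0.25   # floor to the nearest 0.25
    bins[low] = bins.get(low, 0) + 1

for low in sorted(bins):
    print(f"{low:.2f}-{low + 0.24:.2f}: {bins[low]} videos")
```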

[Chart 06: star rating by year]

Forgive the ugliness of this one — OpenOffice is somewhat limited in its abilities, so I had to kind of hack it together. What this shows is the average star rating by year (the bars), with the total number of star ratings per year in the background. You can use this to get a visual idea of how reliable any one reading is relative to a reading from a different year. Thus, of the data collected, 2002’s star rating is the most reliable (i.e. closest to the true value) and 2010’s and 2012’s the least (not counting 2015, which has no star ratings at all, so it’s a moot point).

There’s an interesting upward trend in this particular set of data, and I wonder if that is indicative of anything or (more likely) just due to a lack of data from later years.

[Chart 07: star rating by length]

At first glance, the above chart looks pretty definitive — the logical conclusion is that people tend to like longer videos. However, please notice the count of videos in the 0-60 second range, as well as in the 61-120 second range. Both are low enough that they can pretty safely be ignored (and you can probably knock out the 300+ second videos as well). So really, this doesn’t tell us much that we couldn’t have guessed. It is interesting to note, though, that there is a comparable number of videos in the 61-120 range as in the 300+ range, so those two can be compared with some confidence, and the difference is pretty incredible — people definitely like the longer videos. Not what I would have expected.

As time goes on though, I do expect the lower end of the length spectrum to come up a bit in terms of star rating average.

[Chart 08: length by genre]

There are a couple of other video length-related graphs as well. The one above shows the average video length by genre, and it’s quite interesting because for most genres there’s a large enough volume of data, I think, that the differences can be compared fairly reliably.

A couple of interesting things to draw from this — first, I’ve included “FXgasm” in here even though that’s a tag, not a genre. Nevertheless, I was interested to see how long these videos run on average compared against others, and I find it surprising that the tag ranks so high on the list. I’d have thought it would be found much closer to the bottom, but so far…nope. I will also note this: I made a post doing some early analysis many months ago, and this was one of the things I brought up. As you can see there, FXgasm videos were actually the longest videos by a considerable margin; however, at that time only 53 were in the database. Since then, 37 more have been added and the average duration has decreased by a full 23.3 seconds, which is a very nontrivial amount. So, we’ll likely see this average continue to decrease as the Genome Project grows.
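
Just to put that drop in perspective: for 37 new videos to pull a 53-video average down by 23.3 seconds, the newcomers have to average roughly 57 seconds shorter than the earlier FXgasm videos did. Quick back-of-the-envelope check:

```python
# Back-of-the-envelope: how much shorter must the 37 newly added FXgasm
# videos be, on average, to drag the overall average down by 23.3 seconds?
old_count, added_count, drop = 53, 37, 23.3

# If the old mean is m, the new overall mean is m - drop:
#   (old_count * m + added_count * m_added) / (old_count + added_count) = m - drop
# which rearranges to m_added = m - drop * (old_count + added_count) / added_count.
shortfall = drop * (old_count + added_count) / added_count
print(f"New additions average ~{shortfall:.1f} s shorter than the old average")
# -> roughly 56.7 seconds shorter
```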

Second, all the “fun” videos populate the lower end of the spectrum, which makes sense, as these videos typically make use of poppier, more to-the-point songs that tend to run shorter. I also think it’s a lot easier to justify cutting these songs short, and to not worry about crafting a story or other things that might take more time to develop.

[Chart 09: length over time]

Finally, here is what is to me the most interesting and telling graph of them all. It shows the average length of videos over the last 15 years; the blue line is a trend line, the red shows the actual count for each year. I’ve been keeping my eye on this for a good 6+ months now, and it hasn’t changed, except to become more extreme over time. I think it’s safe to say, with some measure of certainty, that videos are getting shorter on the whole. I don’t discriminate by video length when I download videos, so this is a pretty unbiased look into this particular set of data.
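
For what it’s worth, if the trend line is the standard linear one, it’s just a least-squares fit over the yearly averages, something like the sketch below. The numbers here are placeholders, not the real yearly figures (those are in the spreadsheet):

```python
# Rough sketch of the length-over-time trend line: a degree-1 least-squares
# fit over average video length per year. The values below are invented.
import numpy as np

years = np.array([2000, 2003, 2006, 2009, 2012, 2015])
avg_length_s = np.array([260.0, 245.0, 238.0, 230.0, 215.0, 205.0])

slope, intercept = np.polyfit(years, avg_length_s, 1)
print(f"Trend: {slope:+.1f} seconds per year")  # negative slope = getting shorter
```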

I do think that as time goes on the downward trend may become somewhat less extreme; however, I don’t expect it to even out completely. Newer videos are shorter, and I suspect that’s simply the truth of the matter.

I have my own guesses as to why this is. I won’t go into too much detail here, as I want to collect more data before I start editorializing, but I think it has a lot to do with YouTube, electronic music, and the general ADD culture that seems to define the younger generations. However you look at it, it’s pretty fascinating if you ask me.

Personal ratings/analysis

Okay! Well, I think that’s enough of the objective data analysis for now. There are quite a few other graphs and charts that I haven’t mentioned here, because it would take too long and this is already pretty dry content as it is. But feel free to explore them yourself; they’re all in the spreadsheet.

However, since this is my spreadsheet, used for my own organizational purposes, I have also put a lot of work into recording my own feelings on AMVs, in the form of rating each video on a scale of 0.5-10. This is mostly for myself, but I dunno, other people might find it interesting. Here’s a metrics-based look into what I think of AMVs:

[Chart 12: rating avg]

I’m happy with this — I tend to overrate things I like and jump into hyperbole without a second thought, AND with AMVs I tend to be really cynical and critical, so I was worried that I would average around either 4.00-4.50 or 7.25-7.50. An average of 6.72 is good — it means I’m much less cynical than I thought, while at the same time not being overly sunshiney about everything. I’m actually more positive than I thought I would be, and I think that’s a good thing! Still, the amount of crap I’ve had to watch…

[Chart 10: my rating by year]

An interesting graph simply because of the shape it takes. Good videos seem to come in ~6 year cycles, and I wonder why that is. I have no idea at this point, or even if this is a reliable graph in any way, but it’s cool to ponder.

Also, 2015 has been an unmitigated disaster so far. Hopefully the con season can redeem it somewhat.

[Chart 11: my rating by genre]

I also wanted to examine what types of videos I like the most. The fact that horror videos are at the top doesn’t surprise me; I’ve said before that horror videos, when done right, are some of the best out there. Although only 52 such videos are in the database, they’re pretty good, it would seem.

It does surprise me that romance videos are so far down on the list, as I tend to prefer romance videos over others, but maybe it’s because I watch more of them that I also tend to see a lot that aren’t so great. I don’t know.

Anyway, I’ll leave you with this: I’ve rated 21 videos in the spreadsheet at 10/10, and I’ll list them out for the sake of simplicity, along with .org links to check them out. I’d recommend them — they’re the videos that are, by my estimation anyway, the best of the best. I’m sure I’ll end up making individual posts about some of them as time goes on, but if you’re ever in the mood for some great AMVs, check these out:

aerialesque – Two Hearts
Alex Daikou – DANSU
aluminumstudios – Silence
Beowulf – Where Do We Go
DreamsofaCobra – Asuka’s Wonderland
ErMaC – Extraordinary World
evilspider – Dive
jbone – Something Wonderful
Kevin Caldwell – Caffeine Encomium
Koopiskeva – Damaged Rei-Mix
Koopiskeva – Momentum
Koopiskeva – Paper Image
Marc Hairston – Captain Nemo
Megamom – C o c k t a i l
Megamom – Sharing Light
Megamom – The spiral the girls up wake
Nostromo_vx – Binary Overdrive
Nostromo_vx – Galaxy Bounce
Nostromo_vx – Magic Pad
Scott A Melzer – All Star
WalterScott – Undiscovered Countries
