
Db table with most records you have worked with



    Hi all
    I have just picked myself up from my chair.
    On doing a bit of analysis of the client co's system I find a table with quite a high number of records:
    2.023.740.108

    Un bloody believable


    What is your record?
    "Condoms should come with a free pack of earplugs."

    #2
    Sure you never pinged for an IP address?

    Comment


      #3
      This table has a few records.

      Comment


        #4
        Originally posted by minestrone View Post
        Sure you never pinged for an IP address?
        True, he's not the sharpest knife in the drawer.
        Hard Brexit now!
        #prayfornodeal

        Comment


          #5
          About 4 billion, although it was a materialised view consisting of a 4 way cartesian self join and a couple of other little tables. I did try and warn them it wasn't a good idea.
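The blow-up is easy to quantify: a k-way cartesian join of an n-row table returns n**k rows, so roughly 252 base rows are enough to clear 4 billion on a 4-way self join. A minimal sqlite3 sketch (table name and sizes hypothetical, not the actual view):

```python
import sqlite3

# Tiny base table: 10 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10)])

# 4-way cartesian self join: no join condition, so every combination
# of four rows comes back.
(rows,) = conn.execute("SELECT COUNT(*) FROM t a, t b, t c, t d").fetchone()
print(rows)  # 10**4 = 10000

# At ~252 base rows the same query already tops 4 billion rows.
print(252 ** 4)  # 4032758016
```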
          While you're waiting, read the free novel we sent you. It's a Spanish story about a guy named 'Manual.'

          Comment


            #6
I worked on a schema for a major retail bank which consisted of a 3-table model of type, attribute & data. Oracle had to come in when performance, somehow unsurprisingly, struggled, and they claimed it was the biggest table they had ever seen.
            Last edited by minestrone; 9 September 2011, 11:18.
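For anyone who hasn't met it, that type/attribute/data shape is the classic entity-attribute-value (EAV) model. A minimal sqlite3 sketch (all table and attribute names hypothetical) of why row counts multiply and even trivial reads need self joins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One generic triples table instead of real columns: every attribute of
# every entity becomes its own row.
conn.execute("CREATE TABLE data (entity_id INTEGER, attribute TEXT, value TEXT)")

order_attrs = {"customer": "ACME", "site_1_address": "1 High St",
               "total": "99.50"}
conn.executemany("INSERT INTO data VALUES (1, ?, ?)", order_attrs.items())

# Reading two fields of one "record" already takes a self join.
row = conn.execute("""
    SELECT c.value, t.value
    FROM data c JOIN data t ON c.entity_id = t.entity_id
    WHERE c.attribute = 'customer' AND t.attribute = 'total'
""").fetchone()
print(row)  # ('ACME', '99.50')
```

With real columns this would be one row per order and a plain SELECT; in EAV each extra attribute is another row and another self join.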

            Comment


              #7
              Originally posted by minestrone View Post
I worked on a schema for a major retail bank which consisted of a 3-table model of type, attribute & data. Oracle had to come in when performance, somehow unsurprisingly, struggled, and they claimed it was the biggest table they had ever seen.
              Oracle used something similar for early versions of oracle workflow. Oh how we laughed when the people replacing our "pilot" system decided to model 4000 attributes directly (including such niceties as site_1_address, site_2_address, site_3_address, site_4_address) rather than using the order number as a route into a proper data model. 100,000 orders, 400 million rows (this was on fairly low end sun hardware, our system ran happily on a sun ultra something with 8 disks IIRC) and slightly less than stellar interactive performance. What's that Bob, you need to go back to the drawing board? Do you want to borrow my crayons?
              While you're waiting, read the free novel we sent you. It's a Spanish story about a guy named 'Manual.'

              Comment


                #8
                The table holding the complete known list of prime numbers must be getting pretty large.

                At least it only requires one column.

                Largest known prime number - Wikipedia, the free encyclopedia

                The record passed one million digits in 1999, earning a $50,000 prize.[4] In 2008 the record passed ten million digits, earning a $100,000 prize.[5] Additional prizes are being offered for the first prime number found with at least one hundred million digits and the first with at least one billion digits.

                I spot a plan B. Anyone done a p2p prime number generator app yet?
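Not that anyone will break the record this way, but a one-column primes table is trivial to populate. A minimal sieve of Eratosthenes sketch:

```python
def primes_up_to(n):
    # Sieve of Eratosthenes: flag array over 0..n, cross off multiples.
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    p = 2
    while p * p <= n:
        if sieve[p]:
            # Cross off every multiple of p, starting at p*p.
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
        p += 1
    return [i for i, flag in enumerate(sieve) if flag]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

The record hunts themselves don't sieve, of course; GIMPS tests huge Mersenne candidates individually with the Lucas-Lehmer test.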
                Feist - 1234. One camera, one take, no editing. Superb. How they did it
                Feist - I Feel It All
                Feist - The Bad In Each Other (Later With Jools Holland)

                Comment


                  #9
                  Originally posted by doodab View Post
                  Oracle used something similar for early versions of oracle workflow. Oh how we laughed when the people replacing our "pilot" system decided to model 4000 attributes directly (including such niceties as site_1_address, site_2_address, site_3_address, site_4_address) rather than using the order number as a route into a proper data model. 100,000 orders, 400 million rows (this was on fairly low end sun hardware, our system ran happily on a sun ultra something with 8 disks IIRC) and slightly less than stellar interactive performance. What's that Bob, you need to go back to the drawing board? Do you want to borrow my crayons?
There was a great drive in the industry against hard coding anything for a few years, to the point where objects and tables were completely abstracted away from what they represent. Stick users, accounts, addresses and transactions into one table of type/attribute/data columns.

Nobody had a clue what the software did on these systems; you cannot look at the code and work out what is going to happen, because it is all resolved at run time.

Sticking a breakpoint on setName was never an option when a name attribute got fecked up; you had to stick a breakpoint on setAttribute and drift through countless cycles of the method.
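The debugging pain generalises to any fully generic setter. A toy Python sketch (hypothetical class, nothing to do with the actual system): with one setAttribute-style method, a breakpoint fires on every attribute write, and you filter by hand for the one you care about.

```python
class Record:
    # Fully generic EAV-style object: every field goes through one setter.
    def __init__(self):
        self.attrs = {}

    def set_attribute(self, name, value):
        # The only hookable spot. A conditional breakpoint here has to
        # filter on `name`, and it still triggers on every single write.
        self.attrs[name] = value

r = Record()
for i in range(1000):
    r.set_attribute(f"attr_{i}", i)  # 1000 irrelevant hits first...
r.set_attribute("name", "Bob")       # ...before the write you wanted
print(len(r.attrs))  # 1001
```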

                  Comment


                    #10
                    There's a load into the Data Warehouse of a few hundred million records every year - fortunately only a handful of fields and it is being summarised. The source, however, never gets trimmed down and must hold several billion records of ~50 fields.

                    I've heard the health insurance companies here are already running databases in the Petabyte range.
                    Down with racism. Long live miscegenation!

                    Comment
