| From: | Joe Conway <mail(at)joeconway(dot)com> |
|---|---|
| To: | Pat Trainor <pat(dot)trainor(at)gmail(dot)com>, pgsql-general(at)lists(dot)postgresql(dot)org |
| Subject: | Re: How To: A large [2D] matrix, 100,000+ rows/columns |
| Date: | 2023-06-09 13:16:02 |
| Message-ID: | [email protected] |
| Lists: | pgsql-general |
On 6/8/23 22:17, Pat Trainor wrote:
> Imagine something akin to stocks, where you have a row for every stock,
> and a column for every stock. Except where the same stock is the row &
> col, a number is at each X-Y (row/column), and that is the big picture.
> I need to have a very large matrix to maintain & query, and if not
> (1,600 column limit), then how could such data be broken down to work?
100,000 rows *
100,000 columns *
8 bytes (assuming float8)
= about 80 GB per matrix, if I got the math correct.
Is this really a dense matrix or is it sparse? What kind of operations?
Does it really need to be stored as such or could it be stored as
vectors that are converted to a matrix on the fly when needed?
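For example, a minimal sketch of that vectors-on-the-fly idea (the `matrix_rows` table, its columns, and the connection string are all hypothetical, not anything from the original post): each matrix row lives in a `float8[]` column, and only the rows a computation actually needs get materialized client-side.

```python
# Sketch only, assuming a hypothetical table:
#   CREATE TABLE matrix_rows (row_id int PRIMARY KEY, vals float8[]);
import numpy as np
import psycopg2

conn = psycopg2.connect("dbname=matrixdb")  # hypothetical connection string

def fetch_submatrix(row_ids):
    """Assemble just the requested rows into a dense NumPy block."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT vals FROM matrix_rows WHERE row_id = ANY(%s) ORDER BY row_id",
            (row_ids,),
        )
        # psycopg2 maps float8[] to a Python list, so the rows stack directly
        return np.array([r[0] for r in cur.fetchall()], dtype=np.float64)

block = fetch_submatrix(list(range(100)))  # 100 x 100,000 is ~80 MB, not 80 GB
print(block.shape)
```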
Seems like using Python or R makes more sense. It might also make
sense to store the data in Postgres and use plpython or plr. But it is
hard to say without more details.
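As a hedged sketch of that plpython route, using the same hypothetical `matrix_rows` layout as above (requires the PL/Python extension), here is a function that computes the dot product of two stored row-vectors inside the server, without ever building the full matrix:

```sql
-- Sketch only; matrix_rows is hypothetical, as above.
-- Requires: CREATE EXTENSION plpython3u;
CREATE OR REPLACE FUNCTION row_dot(a int, b int)
RETURNS float8
LANGUAGE plpython3u
AS $$
    # float8[] arrives in PL/Python as a plain Python list
    plan = plpy.prepare(
        "SELECT vals FROM matrix_rows WHERE row_id = $1", ["int"])
    va = plpy.execute(plan, [a])[0]["vals"]
    vb = plpy.execute(plan, [b])[0]["vals"]
    # dot product of the two row-vectors, no full matrix materialized
    return sum(x * y for x, y in zip(va, vb))
$$;

-- usage: SELECT row_dot(1, 2);
```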
--
Joe Conway
PostgreSQL Contributors Team
RDS Open Source Databases
Amazon Web Services: https://aws.amazon.com