The task was divided into 3 steps:
1) The first step was to map what is persisted by each of the application's features.
2) The next step was to estimate the average use of the application, that is, how many times each feature would be used in a month.
3) With this information, we simply had to estimate the average row size for each table and multiply it by the number of records estimated in the previous two steps. In Oracle, the average row size can easily be obtained by querying dba_tables after analyzing the tables:
-- Analyze the tables and generate statistics for them (including average row size)
ANALYZE TABLE <table name> COMPUTE STATISTICS;
-- Get the generated average row size (in bytes) for every table in the schema
SELECT OWNER, TABLE_NAME, AVG_ROW_LEN
FROM dba_tables
WHERE OWNER = '<schema owner>'
ORDER BY TABLE_NAME;
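Once the average row size is known, the per-table estimate is just a multiplication. As an illustrative sketch (the table name ORDERS and the figure of 50,000 monthly inserts are hypothetical, standing in for the numbers from step 2), the expected monthly growth of a single table could be computed like this:
-- Hypothetical example: expected monthly growth of the ORDERS table,
-- assuming roughly 50,000 rows inserted per month (value taken from step 2)
SELECT TABLE_NAME,
       AVG_ROW_LEN,
       AVG_ROW_LEN * 50000 AS ESTIMATED_MONTHLY_BYTES
FROM dba_tables
WHERE OWNER = '<schema owner>'
  AND TABLE_NAME = 'ORDERS';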
If you want to be extra cautious, you could use the maximum row size instead of the average row size.
The average row size depends on the data you have already inserted into your database, so it could be badly off if your data sample is small or poorly distributed. The maximum row size, on the other hand, gives you the largest size a row can take and can therefore be used for a more conservative estimate of the database size.
To get the maximum row size for all tables of a user in Oracle:
-- MAX_ROW_SIZE: sum of the declared column sizes for each table
SELECT TABLE_NAME,
       SUM(DATA_LENGTH) AS MAX_ROW_SIZE
FROM user_tab_columns
GROUP BY TABLE_NAME
ORDER BY TABLE_NAME;
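The same multiplication then gives the conservative version of the estimate. As a hypothetical sketch (again assuming 50,000 rows per month for a table named ORDERS):
-- Hypothetical example: upper-bound monthly growth of the ORDERS table,
-- using the maximum row size instead of the average
SELECT TABLE_NAME,
       SUM(DATA_LENGTH) * 50000 AS ESTIMATED_MONTHLY_BYTES_UPPER_BOUND
FROM user_tab_columns
WHERE TABLE_NAME = 'ORDERS'
GROUP BY TABLE_NAME;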