Date of Award

2015

Document Type

Thesis

Degree Name

Master of Science

College

College of Arts and Sciences

Program

Engineering & Computer Science, MS

First Advisor

Hyun Kwon

Abstract

Problem

The lack of a theory explaining the human thought process latently affects the general perception of problem-solving activities. The present study theorizes the human thought process (HTP) to ascertain, in general, the effect of problem-solving inadequacy on efficiency.

Method

To theorize the human thought process (HTP), basic human problem-solving activities were investigated through the lens of the problem-solution cycle (PSC). The scope of the PSC investigation focused on the inefficiency problem in software construction and the latent characteristic efficiencies of a similar networked system. To analyze these PSC activities, three mathematical quotients and a messaging wavefunction model, analogous to Schrödinger's electronic wavefunction model, were derived for four intrinsic brain traits: intelligence, imagination, creativity, and language. These were substantiated using appropriate empirical verifications. First, statistical analysis of the intelligence, imagination, and creativity quotients was performed using empirical data with global statistical views from:

1. the 1994–2004 CHAOS reports, the Standish Group International's surveys of software development project successes and failures;
2. 2000–2009 Global Creativity Index (GCI) data based on the 3Ts of economic development (technology, talent, and tolerance indices) from 82 nations; and
3. other varied localized success and failure surveys from 1994–2009 and 1998–2010, respectively.

These statistical analyses were performed using a spliced decision Sperner system (SDSS) to show that the averages of all empirical scientific data on the successes and failures of software production within the specified periods are in excellent agreement with the theoretically derived values. Further, the catalytic effect of creativity (thought catalysis) in the human thought process is outlined and shown to agree with newly discovered branch-like nerve cells in the brains of mice (similar to the human brain). Second, the networked communication activities of the language trait during the PSC were analyzed statistically using journal-journal citation data from 13 randomly selected major chemistry journals of 1984.
With the aid of the aforementioned messaging wave formulation, computer simulations of message-phase "thermograms" and "chromatograms" were generated to provide messaging line spectra relative to the behavioral messaging activities of the messaging network under study.

Results

Theoretical computations stipulated that 66.67% of efficiency was due to interactions of the intelligence, imagination, and creativity traits (multi-computational skills) and 33.33% was due to the networked linkages of the language trait (aggregated language skills). The worldwide software production and economic data used were normally distributed with a significance level α of 0.005. Thus, there existed a permissible error of 1% attributed to the significance level of said normally distributed data. Of the brain-trait quotient statistics, the imagination quotient (IMGQ) score was 52.53% from the 1994–2004 CHAOS data analysis and 54.55% from the 2010 GCI data. Their average reasonably approximated the 50th percentile of the cumulative distribution of problem-solving skills. The creativity quotient (CRTQ) score, on the other hand, was 0.99% from the 1994–2004 CHAOS data and 1.17% from the 2010 GCI data, averaging to nearly 1%. The chance of creativity and intelligence working together as joint problem-solving skills was consistently found to average 11.32% (1994–2004 CHAOS: 10.95%; 2010 GCI: 11.68%). The empirical data analysis also showed that the language inefficiency of thought flow ηʹ(τ) was 35.0977% for the 1994–2004 CHAOS data and 34.9482% for the 2010 GCI data, averaging around 35%. On the success and failure of software production, statistical analysis of the empirical data showed 63.2% average efficiency for successful software production (1994–2012) and 33.94% average inefficiency for failed software production (1998–2010). On the whole, software production projects had a bound efficiency approach level (BEAL) of 94.8%. In the messaging wave analysis of the 13 journal-to-journal citation data sets, the messaging phase-space graphs indicated a fundamental frequency (probable minimum message state) of 11.
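As a sanity check, the reported averages and the BEAL figure follow from simple arithmetic on the stated percentages. The sketch below assumes BEAL is the ratio of the observed average efficiency (63.2%) to the theoretical 66.67% efficiency bound, which reproduces the reported 94.8%; the variable names are illustrative, not taken from the thesis.

```python
# Reproduce the averaged quotient values and the BEAL reported in the Results.

# Imagination quotient (IMGQ) scores, in percent
imgq_chaos = 52.53                        # 1994-2004 CHAOS data
imgq_gci = 54.55                          # 2010 GCI data
imgq_avg = (imgq_chaos + imgq_gci) / 2    # 53.54, near the 50th percentile

# Creativity quotient (CRTQ) scores, in percent
crtq_chaos = 0.99
crtq_gci = 1.17
crtq_avg = (crtq_chaos + crtq_gci) / 2    # 1.08, "a near 1%"

# Chance of creativity and intelligence acting as joint problem-solving skills
joint_avg = (10.95 + 11.68) / 2           # 11.315, reported as 11.32%

# Language inefficiency of thought flow
lang_ineff_avg = (35.0977 + 34.9482) / 2  # about 35.02, "around 35%"

# Bound efficiency approach level (BEAL): observed average efficiency of
# successful software production relative to the theoretical 66.67% bound
# (assumed relationship; it matches the reported value).
beal = 63.2 / 66.67 * 100                 # about 94.8

print(imgq_avg, crtq_avg, joint_avg, lang_ineff_avg, beal)
```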

Conclusions

By comparison, using the cutoff level of printed editions of the Journal Citation Reports to substitute for missing data values is inappropriate. However, values from the optimizing methods harmonized with the fundamental frequency inferred from message wave analysis using informatics wave equation analysis (IWEA). Due to its evenly spaced chronological data snapshots, the SDSS technique inherently diminishes the difficulty associated with handling large data volumes (big data) for analysis. From the CHAOS and GCI data analyses, the averaged CRTQ scores indicate that, on average, only 1 percent of the entire human race can be considered exceptionally creative. However, in the art of software production, the siphoning effect of the existing latent language inefficiency suffocates the process of solution creation, bounding its efficiency at 66.67%. With a BEAL value of 94.8% and a basic human error of 5.2%, it can reasonably be said that software production projects have delivered efficiently within the existing latent inefficiency. Consequently, by inference from the average language inefficiency of thought flow, an average language efficiency of 65% exists in the process of software production worldwide. Reasonably, this correlates very strongly with the existing average software production efficiency of 63.2%, around which the software crisis has stagnated since the inception of software creation. The persistent dismal performance of software production is attributable to the existing central focus on the use of a multiplicity of programming languages. Acting as an "efficiency buffer," this multiplicity minimizes changes to efficiency in software production, thereby limiting software production efficiency theoretically to 66.67%. From both theoretical and empirical perspectives, this latently shrouds software production in a deficit maximum attainable efficiency (DMAE).
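The efficiency bookkeeping in these conclusions can be sketched with the complement relationships the figures imply (assumptions: basic human error is 100% minus BEAL, and language efficiency is 100% minus the averaged language inefficiency of thought flow):

```python
# Efficiency bookkeeping implied by the conclusions (assumed relationships).

beal = 94.8                                  # bound efficiency approach level, %
human_error = 100 - beal                     # 5.2% "basic human error"

lang_inefficiency = 35.0                     # averaged inefficiency of thought flow, %
lang_efficiency = 100 - lang_inefficiency    # 65% average language efficiency

observed_efficiency = 63.2                   # successful software production, %
gap = lang_efficiency - observed_efficiency  # 1.8 points: "correlates very strongly"

print(human_error, lang_efficiency, gap)
```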
The software crisis can be drastically improved only through policy-driven adoption of a universal standard supporting a very minimal number of programming languages. On average, the proposed universal standardization could save the world an estimated 6 trillion US dollars per year, currently lost through the existing inefficient software industry.

Subject Area

Thought and thinking--Mathematical models, Software engineering, Computer software--Development

Creative Commons License

Creative Commons Attribution-No Derivative Works 4.0 International License
This work is licensed under a Creative Commons Attribution-No Derivative Works 4.0 International License.

DOI

https://dx.doi.org/10.32597/theses/83/
