Seminal Seminars

Wide-audience Scientific Seminars of IST-Tagus


  • 17th of June

    Sensible Resource Management for the Future Internet
    by Rui Valadas

    In competitive markets, network operators must apply sensible resource management procedures to provide the QoS contracted with users at minimum cost. This requires measuring and characterizing the traffic, evaluating the network performance, and optimizing both the traffic control mechanisms and the dimensioning of the network capacity. Since the characteristics of Internet traffic are becoming more and more complex, traffic measurements (and what can be learned from them) will play a critical role in successfully enabling the future Internet.

    In this talk we will give an overview of the main issues pertaining to the resource management of the future Internet, namely the traffic control mechanisms, the performance evaluation techniques, the traffic modeling and statistical characterization methods, and the capacity dimensioning procedures. Moreover, we will present a traffic measurement system with a peer-to-peer architecture that was recently introduced to allow a more extensive use of measurements in the resource management process.
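    The capacity dimensioning mentioned above can be illustrated with the classic M/M/1 queueing model. The model choice and the numbers below are illustrative assumptions, not the measurement system described in the talk:

```python
# Minimal capacity-dimensioning sketch using the M/M/1 queueing model
# (an illustrative assumption, not the speaker's actual method).
# Mean delay in an M/M/1 queue: T = 1 / (C - lambda), with C the link
# capacity and lambda the offered load (both in packets per second).

def mm1_mean_delay(load_pps, capacity_pps):
    """Mean packet delay (seconds) for an M/M/1 queue."""
    if load_pps >= capacity_pps:
        raise ValueError("queue is unstable: load must be below capacity")
    return 1.0 / (capacity_pps - load_pps)

def min_capacity_for_delay(load_pps, max_delay_s):
    """Smallest capacity meeting a mean-delay target: C = lambda + 1/T."""
    return load_pps + 1.0 / max_delay_s

# Dimension a link carrying 800 packets/s for a 10 ms mean-delay target.
needed = min_capacity_for_delay(800.0, 0.010)   # 900 packets/s
print(needed, mm1_mean_delay(800.0, needed))
```

    The same two formulas generalize to richer traffic models; the point is only that dimensioning trades capacity against a delay target.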

  • 3rd of June

    CSI: are Mendel's data "too good to be true?"
    by Ana Pires

    Gregor Mendel (1822-1884) is almost unanimously recognized as the founder of modern genetics. However, long ago, a shadow of doubt was cast on his integrity by another eminent scientist, the statistician and geneticist Sir Ronald Fisher (1890-1962), who questioned the honesty of the data that form the core of Mendel's work. This issue, nowadays called "the Mendel-Fisher controversy", can be traced back to 1911, when Fisher first presented his doubts about Mendel's results, though he only published a paper with his analysis of Mendel's data in 1936.

    A large number of papers have been published about this controversy, culminating with the publication in 2008 of a book (Franklin et al., "Ending the Mendel-Fisher controversy") aiming at ending the issue and definitively rehabilitating Mendel's image. However, quoting from Franklin et al., "the issue of the `too good to be true' aspect of Mendel's data found by Fisher still stands".

    We have submitted Mendel's data and Fisher's statistical analysis to extensive computations and simulations, attempting to discover a hidden explanation or hint that could help find an answer to the questions: is Fisher right or wrong, and if Fisher is right, is there any reasonable explanation for the "too good to be true" aspect other than deliberate fraud? In this talk, some results of this investigation and the conclusions obtained will be presented.
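    Fisher's analysis rested on chi-square goodness-of-fit tests of Mendel's counts against the expected Mendelian ratios. A minimal sketch of that computation, with made-up counts (not Mendel's actual data), illustrates the "too good to be true" argument:

```python
# Chi-square goodness-of-fit test of observed counts against Mendel's
# expected 3:1 ratio, the kind of computation underlying Fisher's
# analysis.  The counts below are illustrative, not Mendel's own data.

def chi_square_stat(observed, expected):
    """Pearson chi-square statistic: sum((O - E)^2 / E)."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

n = 1000                          # total plants in a hypothetical experiment
expected = [0.75 * n, 0.25 * n]   # 3:1 dominant:recessive
observed = [752, 248]             # suspiciously close to expectation

stat = chi_square_stat(observed, expected)
# Fisher's point: over many experiments, chi-square values this small
# occur improbably often in Mendel's data ("too good to be true").
print(round(stat, 4))
```
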

  • 20th of May

    Multivariate estimation of socioeconomic regional clusters
    by João Oliveira Soares

    This talk begins by discussing the potential of statistical cluster analysis for analyzing regional policies and for supporting regional development initiatives. Two cases involving the European Union Regional Policy illustrate this discussion. The rest of the talk summarizes the methodological issues related to the multivariate estimation of socioeconomic regional clusters, namely, the need for data reduction, and the choice of variables and clustering algorithm. A case study involving European socioeconomic indicators from the Eurostat Database Regio is used to compare the variability of the results caused by either the variable selection or the clustering method selection.
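    As a toy illustration of the clustering step discussed above, the following sketch runs a minimal k-means on two hypothetical standardized indicators; the data, the indicator names and the choice of k-means (one of several algorithms the talk compares) are all illustrative assumptions:

```python
# Minimal k-means sketch on toy two-indicator regional data (say, GDP
# per capita and unemployment rate, both standardized).
import math, random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centers, clusters

# Two visually separated groups of "regions".
regions = [(0.9, -0.8), (1.1, -1.0), (1.0, -0.9),   # wealthy, low unemployment
           (-1.0, 0.9), (-0.9, 1.1), (-1.1, 1.0)]   # poorer, high unemployment
centers, clusters = kmeans(regions, k=2)
print(sorted(len(c) for c in clusters))  # two clusters of 3 regions each
```
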

  • 6th of May

    Organizational Design and Engineering: How to survive organizational engineering as an engineer
    by António Rito Silva

    An organization is manifested by the set of interactions among its members. Today, most of these interactions are mediated by computer-based artifacts. The current state of the art in organizational design identifies an entanglement between people and technology that precludes studying them separately: computer-based artifacts shape organizations, they embody an organizational design, and they engender an organization through their use. The question we intend to address in this seminar is: should the organization become the object of engineering activities?
    We propose the design and implementation of computer-based artifacts as the focus of engineering activities, while considering the organizational aspects associated with their design and development. During the presentation, organizational qualities of computer-based artifacts, such as incompleteness, will be discussed, and the development process of computer-based artifacts will be revisited as a knowledge-creation process in which the distinction between designers and users is blurred. The design and implementation of business processes will be used to illustrate the proposal.

  • 25th of March

    Technical Analysis: how to predict future stock market prices based on past prices
    by Rui Neves

    Technical analysis says that it is possible to predict the future price of stocks and commodities based only on the price history of the asset. In contrast, the Efficient Market Hypothesis (EMH) states that markets are efficient and that all information is immediately and rationally discounted in the market. The decline of stock market prices over the last year has shown that the EMH must be wrong.

    In this talk we will show that stock prices have a tendency to trend in one direction, up or down, and to repeat certain patterns. These patterns have curious names, like "head and shoulders", "double top", "descending triangle" and "diamond top", among others. Pattern recognition can thus be used to predict future prices. Moving averages and oscillators are other techniques used in this area, and genetic algorithms can be used to find the best parameters for them.
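    The moving averages mentioned above can be sketched as a simple moving-average crossover rule; the prices and window lengths below are made up for illustration:

```python
# Simple moving-average crossover, one of the technical-analysis tools
# mentioned above.  Prices and window lengths are illustrative.

def sma(prices, window):
    """Simple moving average; None until a full window is available."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def crossover_signals(prices, short=3, long=5):
    """'buy' when the short SMA crosses above the long SMA, 'sell' when below."""
    s, l = sma(prices, short), sma(prices, long)
    signals = []
    for i in range(1, len(prices)):
        if None in (s[i], l[i], s[i - 1], l[i - 1]):
            continue
        if s[i - 1] <= l[i - 1] and s[i] > l[i]:
            signals.append((i, "buy"))
        elif s[i - 1] >= l[i - 1] and s[i] < l[i]:
            signals.append((i, "sell"))
    return signals

prices = [10, 9, 8, 7, 8, 9, 11, 12, 11, 10, 9, 8]
print(crossover_signals(prices))  # [(6, 'buy'), (10, 'sell')]
```
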

  • 11th of March

    Why factoring is important in security
    by Paulo Mateus

    We will overview relevant modern approaches to factorization. We start by showing why factoring is important in security. Then, we give an overview of Shor's quantum algorithm, assuming no expertise from the audience. Finally, we will show how to reduce factorization to the problem of numerical integration.
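    Why factoring matters for security can be illustrated with a toy RSA key: anyone who can factor the modulus recovers the private key. The tiny numbers and trial-division "attack" below are illustrative stand-ins; real moduli have thousands of bits:

```python
# Toy illustration of why factoring matters for security: recovering an
# RSA private key by factoring the (tiny) modulus.  Trial division here
# stands in for the hard step that Shor's algorithm would speed up.

def trial_factor(n):
    """Return a nontrivial factor of n by trial division (toy sizes only)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

# Tiny RSA key pair: modulus n = p*q, public exponent e, private exponent d.
p, q = 61, 53
n = p * q                    # 3233
e = 17
phi = (p - 1) * (q - 1)      # 3120
d = pow(e, -1, phi)          # private exponent (modular inverse of e)

# An attacker who can factor n recovers phi, hence d, hence plaintexts.
p2 = trial_factor(n)
q2 = n // p2
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))

m = 65                       # a message
c = pow(m, e, n)             # encrypt with the public key
print(d_recovered == d, pow(c, d_recovered, n) == m)  # True True
```
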

  • 7th of January

    Hacking life: how to build a new life form in your computer
    by Arlindo Oliveira

    Synthetic biology is a new field of research that combines computer models of biological systems with DNA synthesis and genetic engineering techniques in order to design and build new biological functions, systems and organisms. While still in its infancy, this area of research is expected to develop rapidly, so that very soon researchers, companies and hackers will be able to design, build and release new organisms into the wild. In this talk, I will address some questions and challenges posed by this technology and, in particular, the role that will be played by research areas such as Systems Biology, Bioinformatics and Information Systems in the design of artificial life forms.

  • 10th of December

    The laughs and tears of affective computers
    by Ana Paiva

    In recent years, scientists unraveling the mysteries of human intelligence have been drawn into searching for a better understanding of how the human brain works. In that pursuit, new findings and new theories point out the importance of emotions and affect in human decision making. As such, when creating computational models of intelligence, one of the core goals of the area of Artificial Intelligence, emotions and affect have started to play an important role. Further, as computers interact with humans in an ever more natural and human-inspired way, such interactions ought to take the affective elements into account.

    One of the many ways in which recent advances in affective neuroscience have influenced technology is a new area of computing named "Affective Computing". The term "Affective Computing", originated by Rosalind Picard more than a decade ago, refers to the "computing that relates to, arises from, or deliberately influences emotions". But how can emotions influence computing? How can computers understand the subtle aspects of emotions in human users? This talk will discuss some of the questions that drive the area of affective computing, such as "how can computers understand the emotions of the users?", "how can computers make decisions that are fast and inspired by affective human decision making?" or "how can computers express emotions?".


    [P] R. W. Picard, “Affective Computing”, MIT Press (1997).

  • 19th of November

    Control of edge turbulence in tokamak plasmas
    by Carlos Silva

    Turbulence has long been found to be the dominant cause of the limited confinement observed in fusion devices. The effect of turbulence in plasmas is to increase matter and energy transport. In the operating conditions of tokamaks, the combined effect of electric and magnetic field fluctuations is the cause of large losses.

    One of the success stories of magnetic fusion research over the past decade is the discovery of techniques to reduce turbulence, together with the simultaneous development of the so-called flow shear stabilization model to explain how those techniques work. Experimental tools have been developed to control the plasma flow, and their importance for global confinement has been demonstrated in a wide range of plasma conditions.


    [SFNGV] C. Silva, H. Figueiredo, I. Nedzelskiy, B. Gonçalves and C.A.F. Varandas, “Control of the edge turbulent transport by emissive electrode biasing on the tokamak ISTTOK”, Plasma Phys. Control. Fusion 48 (2006) 727.

  • 5th of November

    IT Management
    by Miguel Mira da Silva

    Despite the dramatic evolution of Information Technologies (IT) in the last 35 years, the current challenges regarding IT management are interestingly similar: how to "align" IT and business. In 1973, Nolan had already mentioned maturity levels as a path towards business/IT alignment. But here we are in 2008, still trying to achieve the same goal...

    The fundamental issue is to find out the *root cause* of this problem, which IMHO is the exaggerated technical background found in most IT degrees. For that reason, I do believe that the problem is about to be solved in the next few years as new IT students graduate with less -- or even no -- technical background.

    To demonstrate my point of view, I will present the great contribution of Prince Henry the Navigator (to whom I dedicate my latest book) to IT management. Just remember that Henry -- the so-called "Navigator" -- did not know how to navigate. Why then should IT managers know a lot about IT?


    [N] R. L. Nolan, "Managing the computer resource: a stage hypothesis", Commun. ACM 16, 7 (Jul. 1973), 399-405.

  • 22nd of October

    Robust regression: an added value for a fundamental model  
    by João Branco

    Since the last part of the 19th century, regression has been the everyday bread of a vast population of users of statistics working in a large variety of fields. The least squares method has been the butter that prepares the ingestion and facilitates the study of the model.

    Here it is shown that the bread and butter of statistics may not be as deliciously tasteful as its voracious consumption appears to suggest. The recipe may fail and the taste may turn bitter, which means that the process may lead to misleading results.

    To prevent such inconveniences, robust regression is proposed. This is a more satisfying method that protects users from the weaknesses of least squares. The central objective of the seminar is the study of robust regression. Its advantages, limitations, popularity and the main methods to induce robustness in regression will be the main topics discussed.
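    The weakness of least squares, and one simple robust alternative, can be sketched on data with a single gross outlier. The references below discuss S-estimators; the Theil-Sen median-of-slopes line is used here only because it fits in a few lines:

```python
# Least squares versus a simple robust alternative (the Theil-Sen
# median-of-slopes line) on data with one gross outlier.
from statistics import median

def least_squares_slope(xs, ys):
    """Ordinary least squares slope: cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def theil_sen_slope(xs, ys):
    """Median of all pairwise slopes; a single outlier barely moves it."""
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i in range(len(xs)) for j in range(i + 1, len(xs))
              if xs[j] != xs[i]]
    return median(slopes)

xs = [1, 2, 3, 4, 5, 6]
ys = [2, 4, 6, 8, 10, 60]     # y = 2x, except one wild outlier
print(round(least_squares_slope(xs, ys), 2))  # dragged far above 2
print(theil_sen_slope(xs, ys))                # stays at 2.0
```
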


    [RY] P. J. Rousseeuw and V. J. Yohai. Robust regression by means of S-estimators. In Robust and Nonlinear Time Series Analysis. New York: Springer (1984), 256-272.

    [SY]  M. Salibian-Barrera and V. J. Yohai. A Fast Algorithm for S-Regression Estimates. Journal of Computational & Graphical Statistics, (2006) 15, 2, 414-427.


If you would like to contribute a presentation or suggest a speaker to be invited, contact ...